Mar 20 13:30:19 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 13:30:19 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:30:19 crc restorecon[4698]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 
crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 
crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.923180 4755 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934051 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934110 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934121 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934131 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934141 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934150 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934183 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934194 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934202 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934210 4755 
feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934218 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934225 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934237 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934248 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934257 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934268 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934280 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934293 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934305 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934317 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934327 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934335 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934344 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934352 
4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934360 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934367 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934375 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934383 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934391 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934399 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934410 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934419 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934428 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934438 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934446 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934454 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934461 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934469 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934477 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934502 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934510 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934518 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934527 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934535 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934543 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934551 4755 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934560 4755 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934567 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934575 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934584 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934591 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934598 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934608 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934618 4755 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934627 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934634 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934642 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934685 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934693 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934701 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934708 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934716 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934723 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934731 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934738 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934746 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934754 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934763 4755 feature_gate.go:330] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934770 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934783 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934797 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935778 4755 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935806 4755 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935822 4755 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935835 4755 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935848 4755 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935858 4755 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935871 4755 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935883 4755 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935893 4755 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935903 4755 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935913 4755 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935923 4755 flags.go:64] 
FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935933 4755 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935942 4755 flags.go:64] FLAG: --cgroup-root="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935952 4755 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935961 4755 flags.go:64] FLAG: --client-ca-file="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935970 4755 flags.go:64] FLAG: --cloud-config="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935979 4755 flags.go:64] FLAG: --cloud-provider="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935988 4755 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936003 4755 flags.go:64] FLAG: --cluster-domain="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936012 4755 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936022 4755 flags.go:64] FLAG: --config-dir="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936033 4755 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936044 4755 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936056 4755 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936066 4755 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936076 4755 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936085 4755 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936095 4755 flags.go:64] FLAG: 
--contention-profiling="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936104 4755 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936114 4755 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936125 4755 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936136 4755 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936148 4755 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936157 4755 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936166 4755 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936176 4755 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936185 4755 flags.go:64] FLAG: --enable-server="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936195 4755 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936945 4755 flags.go:64] FLAG: --event-burst="100" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936957 4755 flags.go:64] FLAG: --event-qps="50" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936968 4755 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936978 4755 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936988 4755 flags.go:64] FLAG: --eviction-hard="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937001 4755 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937011 4755 flags.go:64] FLAG: 
--eviction-minimum-reclaim="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937021 4755 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937031 4755 flags.go:64] FLAG: --eviction-soft="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937040 4755 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937050 4755 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937059 4755 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937069 4755 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937079 4755 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937088 4755 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937097 4755 flags.go:64] FLAG: --feature-gates="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937109 4755 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937118 4755 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937129 4755 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937138 4755 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937148 4755 flags.go:64] FLAG: --healthz-port="10248" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937157 4755 flags.go:64] FLAG: --help="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937167 4755 flags.go:64] FLAG: --hostname-override="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937177 4755 flags.go:64] FLAG: 
--housekeeping-interval="10s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937188 4755 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937198 4755 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937208 4755 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937217 4755 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937226 4755 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937238 4755 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937250 4755 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937261 4755 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937283 4755 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937304 4755 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937317 4755 flags.go:64] FLAG: --kube-reserved="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937329 4755 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937340 4755 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937352 4755 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937365 4755 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937376 4755 flags.go:64] FLAG: --lock-file="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937386 4755 
flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937398 4755 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937410 4755 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937428 4755 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937440 4755 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937451 4755 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937463 4755 flags.go:64] FLAG: --logging-format="text" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937475 4755 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937488 4755 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937499 4755 flags.go:64] FLAG: --manifest-url="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937510 4755 flags.go:64] FLAG: --manifest-url-header="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937523 4755 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937533 4755 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937545 4755 flags.go:64] FLAG: --max-pods="110" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937555 4755 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937565 4755 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937574 4755 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 
13:30:20.937596 4755 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937606 4755 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937616 4755 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937625 4755 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937684 4755 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937697 4755 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937710 4755 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937721 4755 flags.go:64] FLAG: --pod-cidr="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937737 4755 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937757 4755 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937768 4755 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937779 4755 flags.go:64] FLAG: --pods-per-core="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937788 4755 flags.go:64] FLAG: --port="10250" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937798 4755 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937807 4755 flags.go:64] FLAG: --provider-id="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937815 4755 flags.go:64] FLAG: --qos-reserved="" Mar 20 13:30:20 crc 
kubenswrapper[4755]: I0320 13:30:20.937825 4755 flags.go:64] FLAG: --read-only-port="10255" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937835 4755 flags.go:64] FLAG: --register-node="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937844 4755 flags.go:64] FLAG: --register-schedulable="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937853 4755 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937869 4755 flags.go:64] FLAG: --registry-burst="10" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937878 4755 flags.go:64] FLAG: --registry-qps="5" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937887 4755 flags.go:64] FLAG: --reserved-cpus="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937896 4755 flags.go:64] FLAG: --reserved-memory="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937908 4755 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937918 4755 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937928 4755 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937938 4755 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937949 4755 flags.go:64] FLAG: --runonce="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937958 4755 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937970 4755 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937980 4755 flags.go:64] FLAG: --seccomp-default="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937989 4755 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 13:30:20 crc 
kubenswrapper[4755]: I0320 13:30:20.938001 4755 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938011 4755 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938021 4755 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938030 4755 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938039 4755 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938049 4755 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938058 4755 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938067 4755 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938076 4755 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938086 4755 flags.go:64] FLAG: --system-cgroups="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938095 4755 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938111 4755 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938121 4755 flags.go:64] FLAG: --tls-cert-file="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938132 4755 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938161 4755 flags.go:64] FLAG: --tls-min-version="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938175 4755 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938188 4755 flags.go:64] FLAG: 
--topology-manager-policy="none" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938199 4755 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938210 4755 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938222 4755 flags.go:64] FLAG: --v="2" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938238 4755 flags.go:64] FLAG: --version="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938253 4755 flags.go:64] FLAG: --vmodule="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938267 4755 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938280 4755 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938517 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938530 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938542 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938553 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938562 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938570 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938578 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938587 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938596 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938603 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938611 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938618 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938626 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938634 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938642 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938690 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938701 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938721 4755 
feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938737 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938747 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938756 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938766 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938775 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938788 4755 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938799 4755 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938817 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938826 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938839 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938852 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938865 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938877 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938887 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938898 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938909 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938917 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938926 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938934 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938942 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938950 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938958 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938966 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938977 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938987 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938995 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939005 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939013 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939022 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939072 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939080 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939090 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939098 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939105 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939113 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939121 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939129 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939136 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:30:20 crc 
kubenswrapper[4755]: W0320 13:30:20.939144 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939155 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939163 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939173 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939181 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939188 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939196 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939203 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939211 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939222 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939231 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939239 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939247 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939255 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939263 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.939278 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.960497 4755 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.960563 4755 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960757 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960775 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960786 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960798 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960809 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960819 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960829 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960840 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960851 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960862 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960872 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960882 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960892 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960901 4755 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960912 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960925 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960944 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960958 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960970 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960983 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960994 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961009 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961021 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961032 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961043 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961054 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961064 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961074 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961083 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961094 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961104 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961114 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961124 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961134 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961146 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961157 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961167 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961176 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961190 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961203 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961216 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961226 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961238 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961249 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961259 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961270 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961281 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961291 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961301 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961311 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961322 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961332 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961343 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961357 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961368 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961377 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961387 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961397 4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961407 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961416 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961426 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961436 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961444 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961454 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961464 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961473 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961482 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961491 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961500 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961510 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961522 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.961539 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961870 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961890 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961900 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961910 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961920 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961930 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961941 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961951 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961962 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961972 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961983 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961993 4755 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962002 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962013 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962025 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962036 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962046 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962056 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962069 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962083 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962094 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962104 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962114 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962124 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962134 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962144 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962154 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962164 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962175 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962184 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962194 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962205 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962215 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962224 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962236 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962246 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962257 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962268 4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962277 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962288 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962298 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962309 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962319 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962333 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962347 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962359 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962371 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962382 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962394 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962404 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962414 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962427 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962439 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962450 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962462 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962473 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962484 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962494 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962504 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962513 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962523 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962533 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962543 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962556 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962569 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962580 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962591 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962601 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962611 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962621 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962633 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.962682 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.964400 4755 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 13:30:20 crc kubenswrapper[4755]: E0320 13:30:20.970044 4755 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.977174 4755 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.977349 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.979133 4755 server.go:997] "Starting client certificate rotation"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.979173 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.980597 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.010435 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.014244 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.018964 4755 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.039318 4755 log.go:25] "Validated CRI v1 runtime API"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.078523 4755 log.go:25] "Validated CRI v1 image API"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.082253 4755 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.090901 4755 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-13-24-34-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.090962 4755 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.121909 4755 manager.go:217] Machine: {Timestamp:2026-03-20 13:30:21.116811991 +0000 UTC m=+0.714744580 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ec91ed1b-a6ed-4cb2-884d-632a869fcc2d BootID:382501ad-cb22-4ccb-a572-771d7a82be1e Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:25:d0:34 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:25:d0:34 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:67:3a:c7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b6:07:44 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a0:4d:b7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c9:38:ec Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9a:dc:16:6b:b6:6c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:36:a1:c7:80:b2:1e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.122324 4755 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.122534 4755 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123027 4755 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123397 4755 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123457 4755 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123883 4755 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123905 4755 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.124589 4755 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.124648 4755 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.124950 4755 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.125100 4755 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130023 4755 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130065 4755 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130112 4755 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130138 4755 kubelet.go:324] "Adding apiserver pod source"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130160 4755 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.135785 4755 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.141047 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.141259 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.141561 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.141641 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.145954 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.149590 4755 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151590 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151719 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151789 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151845 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151899 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151949 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152013 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152070 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152124 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152187 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152244 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152300 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.155919 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.156816 4755 server.go:1280] "Started kubelet"
Mar 20 13:30:21 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.159729 4755 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.159736 4755 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.159985 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160030 4755 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.160505 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160583 4755 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160635 4755 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160689 4755 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.161386 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="200ms"
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.161439 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.161541 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.161767 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.161777 4755 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162002 4755 factory.go:55] Registering systemd factory
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162101 4755 factory.go:221] Registration of the systemd container factory successfully
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162101 4755 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162382 4755 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162745 4755 factory.go:153] Registering CRI-O factory
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162904 4755 factory.go:221] Registration of the crio container factory successfully
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.163043 4755 factory.go:103] Registering Raw factory
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.163206 4755 manager.go:1196] Started watching for new ooms in manager
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.164017 4755 manager.go:319] Starting recovery of all containers
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.171077 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.180786 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.180899 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.180940 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.181258 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.181994 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182168 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182321 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182395 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182476 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182503 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182538 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182568 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182636 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182721 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182767 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182849 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182897 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182936 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183047 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183190 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183265 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183346 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183443 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183479 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183522 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183554 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183611 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183645 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183737 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183764 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183802 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183828 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183860 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183903 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183925 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183955 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183977 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184012 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184037 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184066 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184114 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184141 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184172 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184196 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184223 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184269 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184299 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184327 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184349 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184375 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184404 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184442 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184484 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184520 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184546 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184581 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185409 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185461 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185477 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185494 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185506 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185519 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185530 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185542 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185554 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185569 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185581 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185591 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185603 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185615 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185628 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185642 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185676 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185692 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185707 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185722 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185744 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185757 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185782 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185797 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185814 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185831 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185848 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 13:30:21 crc
kubenswrapper[4755]: I0320 13:30:21.185868 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186274 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186312 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186328 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186345 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186362 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186377 4755 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186389 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186404 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186420 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186432 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186448 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186473 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186488 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186503 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186519 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186532 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186546 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186561 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186576 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186591 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186621 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186644 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186684 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186704 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186722 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186739 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186754 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186770 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186793 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186844 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186861 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186878 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186896 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186949 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186964 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186978 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" 
seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186992 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187007 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187021 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187038 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187052 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187066 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 
13:30:21.187113 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187128 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187142 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187156 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187173 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187190 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187209 4755 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187227 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187248 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187263 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187404 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187424 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187439 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187454 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187474 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187493 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187512 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187528 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187543 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187559 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187577 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187592 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187607 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187627 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187642 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" 
seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187721 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187739 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187757 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187773 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187790 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187806 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187823 
4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187840 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187856 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187872 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187886 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187901 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187916 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187937 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187956 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187979 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188026 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188044 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188059 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188075 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188092 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188107 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188122 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188138 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188156 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188177 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188194 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188211 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188227 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188243 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188262 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" 
seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188279 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188297 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188314 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188331 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188346 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188365 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188382 4755 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188396 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188415 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188428 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188445 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188461 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188476 4755 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188493 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188512 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188528 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188548 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188565 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188582 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188597 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188613 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188628 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192352 4755 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192493 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192599 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192707 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192787 4755 reconstruct.go:97] "Volume reconstruction finished" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192853 4755 reconciler.go:26] "Reconciler: start to sync state" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.198767 4755 manager.go:324] Recovery completed Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.211919 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.214963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.215020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.215040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.217441 4755 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.217475 4755 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.217893 4755 state_mem.go:36] "Initialized new in-memory state store" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.221316 4755 kubelet_network_linux.go:50] "Initialized iptables 
rules." protocol="IPv4" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.224272 4755 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.224338 4755 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.224367 4755 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.224543 4755 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.225420 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.225553 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.236702 4755 policy_none.go:49] "None policy: Start" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.237845 4755 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.237886 4755 state_mem.go:35] "Initializing new in-memory state store" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.261372 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.294506 4755 
manager.go:334] "Starting Device Plugin manager" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295016 4755 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295056 4755 server.go:79] "Starting device plugin registration server" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295907 4755 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295949 4755 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.296286 4755 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.296440 4755 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.296463 4755 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.312880 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.325630 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.325771 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327123 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327367 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327631 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327684 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.328393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.328412 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.328421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.329447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.329513 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.329527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 
13:30:21.329775 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.330696 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.330771 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331155 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331363 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331463 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331504 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.332960 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333318 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333568 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333693 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335054 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335294 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335347 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335693 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335761 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335885 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.336790 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.336830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.336842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.363506 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="400ms" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396401 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396649 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396754 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396826 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396861 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396927 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396994 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397037 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397084 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397340 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397435 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397510 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.398015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.398058 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.398691 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499499 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499580 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499615 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499743 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499853 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499876 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499768 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499992 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499777 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499958 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500072 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 
13:30:21.500137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500210 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500395 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500388 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500458 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.599108 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601253 4755 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601308 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.602000 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.653969 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.660642 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.678244 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.698617 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.707749 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.723370 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43 WatchSource:0}: Error finding container 64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43: Status 404 returned error can't find the container with id 64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43 Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.730227 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a WatchSource:0}: Error finding container 9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a: Status 404 returned error can't find the container with id 9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.737298 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14 WatchSource:0}: Error finding container 5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14: Status 404 returned error can't find the container with id 5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14 Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.745291 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75 
WatchSource:0}: Error finding container 57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75: Status 404 returned error can't find the container with id 57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75 Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.748823 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7 WatchSource:0}: Error finding container f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7: Status 404 returned error can't find the container with id f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7 Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.764719 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="800ms" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.002928 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.004921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.004980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.004991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.005026 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.005635 4755 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc" Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.043220 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.043363 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.163380 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.229808 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43"} Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.230939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a"} Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.232012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7"} Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.233251 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75"} Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.234088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14"} Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.381818 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.381910 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.566056 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="1.6s" Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.568739 4755 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.568831 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.623587 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.623985 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.806095 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808957 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808990 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.809600 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.055498 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:30:23 crc kubenswrapper[4755]: E0320 13:30:23.056977 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.163109 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.240053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.242764 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" exitCode=0 Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.242917 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.243014 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.245558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.245614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.245630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.248331 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.250909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.250945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.250955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.251827 4755 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d" exitCode=0 Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.251906 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.251934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.252562 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.252585 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.252596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.253864 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6" exitCode=0 Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.253933 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.254041 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.255759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc 
kubenswrapper[4755]: I0320 13:30:23.255797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.255809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.257017 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13" exitCode=0 Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.257057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.257144 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.258265 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.258343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.258362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.163272 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:24 crc kubenswrapper[4755]: E0320 13:30:24.167333 4755 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="3.2s" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262165 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264770 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264793 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264906 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.265670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.265693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.265703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.267563 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97" exitCode=0 Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.267620 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.267827 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.269218 4755 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.269255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.269270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.270927 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.270927 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.271715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.271746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.271754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274250 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.276519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.276635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.276761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.410253 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411663 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411812 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:24 crc 
kubenswrapper[4755]: E0320 13:30:24.412645 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc" Mar 20 13:30:24 crc kubenswrapper[4755]: W0320 13:30:24.559021 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:24 crc kubenswrapper[4755]: E0320 13:30:24.559138 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:25 crc kubenswrapper[4755]: W0320 13:30:25.057731 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:25 crc kubenswrapper[4755]: E0320 13:30:25.057892 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.163040 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: 
connect: connection refused Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.282203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786"} Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.283195 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.284812 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.284897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.284916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287151 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c" exitCode=0 Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287319 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287368 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287325 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287448 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287640 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c"} Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287858 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.289190 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.289217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.289234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc 
kubenswrapper[4755]: I0320 13:30:25.289762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.290539 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.290607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.523747 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.740898 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.293915 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.293970 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294547 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294601 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294698 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294834 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295073 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295094 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295438 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.296023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:26 crc 
kubenswrapper[4755]: I0320 13:30:26.296039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.296047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4"} Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305802 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305920 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305948 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.308528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.308599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.308620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.311173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.311230 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 
13:30:27.311250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.317775 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.476068 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.476330 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.478090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.478142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.478165 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.613195 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615433 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615504 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 
13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.308249 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.309719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.309756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.309767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.741251 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.741370 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.745465 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.745711 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.747028 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.747069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.747083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.795825 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.796150 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.797569 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.797636 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.797674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.530708 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.530937 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.532602 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.532720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 
13:30:30.532751 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:31 crc kubenswrapper[4755]: E0320 13:30:31.313051 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.225759 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.226082 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.229757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.229853 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.229882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.236331 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.318774 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.320635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.320701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.320711 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.327350 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.962489 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.322078 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.323181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.323211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.323223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.324865 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.326693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.326835 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.326865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.214967 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.215074 4755 trace.go:236] Trace[655135621]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:30:25.213) (total time: 10002ms): Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[655135621]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:30:35.214) Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[655135621]: [10.002005968s] [10.002005968s] END Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.215103 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.728804 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.729605 4755 trace.go:236] Trace[410276888]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:30:25.725) (total time: 10004ms): Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[410276888]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10003ms (13:30:35.728) Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[410276888]: [10.004023796s] [10.004023796s] END Mar 20 13:30:35 crc 
kubenswrapper[4755]: E0320 13:30:35.729870 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.734568 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.736189 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.738193 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.738322 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.738275 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.739633 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.739740 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 
13:30:35.742039 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.746726 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.751744 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.751827 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.756696 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.756765 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.168346 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:36Z is after 2026-02-23T05:33:13Z Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.332325 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.334314 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786" exitCode=255 Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.334369 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786"} Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.334541 4755 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.335366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.335405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.335417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.336027 4755 scope.go:117] "RemoveContainer" containerID="d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.463917 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.464216 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.465717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.465771 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.465786 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.516208 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.968421 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:37 crc 
kubenswrapper[4755]: I0320 13:30:37.166595 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:37Z is after 2026-02-23T05:33:13Z Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.340152 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.342097 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343412 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"} Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.344444 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:37 crc kubenswrapper[4755]: 
I0320 13:30:37.344478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.344490 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.363990 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.167702 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:38Z is after 2026-02-23T05:33:13Z Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.349010 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.350048 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352754 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" exitCode=255 Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"} Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352933 4755 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352967 4755 scope.go:117] "RemoveContainer" containerID="d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352934 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354103 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354152 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.355098 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:38 crc kubenswrapper[4755]: E0320 13:30:38.355356 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.742960 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.743196 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.746228 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.166460 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:39Z is after 2026-02-23T05:33:13Z Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.358601 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.360942 4755 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.362256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.362330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.362351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.363443 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:39 crc kubenswrapper[4755]: E0320 13:30:39.363800 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.805043 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:40 crc kubenswrapper[4755]: W0320 13:30:40.041769 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z Mar 20 13:30:40 crc kubenswrapper[4755]: E0320 13:30:40.041885 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.165425 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.363758 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.365002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.365092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.365116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.366138 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:40 crc kubenswrapper[4755]: E0320 13:30:40.366520 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.371431 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:40 crc kubenswrapper[4755]: W0320 13:30:40.773914 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z Mar 20 13:30:40 crc kubenswrapper[4755]: E0320 13:30:40.774081 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.165757 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:41Z is after 2026-02-23T05:33:13Z Mar 20 13:30:41 crc kubenswrapper[4755]: E0320 13:30:41.313930 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.366290 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.367852 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.367940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.367995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.368912 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:41 crc kubenswrapper[4755]: E0320 13:30:41.369259 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.136719 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:42 crc kubenswrapper[4755]: E0320 13:30:42.137901 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:42Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138093 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138124 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:42 crc kubenswrapper[4755]: E0320 13:30:42.144324 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:42Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.165614 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:42Z is after 2026-02-23T05:33:13Z Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.368559 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.369696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.369733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.369748 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.370321 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:42 crc kubenswrapper[4755]: E0320 13:30:42.370509 4755 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:30:43 crc kubenswrapper[4755]: I0320 13:30:43.168949 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:43Z is after 2026-02-23T05:33:13Z Mar 20 13:30:44 crc kubenswrapper[4755]: I0320 13:30:44.166296 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:44Z is after 2026-02-23T05:33:13Z Mar 20 13:30:44 crc kubenswrapper[4755]: I0320 13:30:44.253092 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:30:44 crc kubenswrapper[4755]: E0320 13:30:44.256631 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:45 crc kubenswrapper[4755]: I0320 13:30:45.166976 4755 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:45Z is after 2026-02-23T05:33:13Z Mar 20 13:30:45 crc kubenswrapper[4755]: E0320 13:30:45.742258 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:45Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.167799 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z Mar 20 13:30:46 crc kubenswrapper[4755]: W0320 13:30:46.608097 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z Mar 20 13:30:46 crc kubenswrapper[4755]: E0320 13:30:46.608229 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:46 crc kubenswrapper[4755]: W0320 13:30:46.698726 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z Mar 20 13:30:46 crc kubenswrapper[4755]: E0320 13:30:46.698829 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.967858 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.968075 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.969413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.969461 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.969472 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.970033 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:46 crc kubenswrapper[4755]: E0320 13:30:46.970199 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:30:47 crc kubenswrapper[4755]: I0320 13:30:47.165384 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:47Z is after 2026-02-23T05:33:13Z Mar 20 13:30:47 crc kubenswrapper[4755]: W0320 13:30:47.314282 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:47Z is after 2026-02-23T05:33:13Z Mar 20 13:30:47 crc kubenswrapper[4755]: E0320 13:30:47.314405 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.164756 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:48Z is after 2026-02-23T05:33:13Z Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.742082 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.742842 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.743124 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.743460 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.745359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.745409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.745430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.746262 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.746615 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e" gracePeriod=30 Mar 20 13:30:49 crc kubenswrapper[4755]: W0320 13:30:49.124718 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z Mar 20 13:30:49 crc kubenswrapper[4755]: E0320 13:30:49.124879 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:49 crc kubenswrapper[4755]: E0320 13:30:49.143777 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.144704 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146971 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:49 crc kubenswrapper[4755]: E0320 13:30:49.152703 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.169709 4755 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.392626 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393255 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e" exitCode=255 Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393310 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e"} Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8"} Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393453 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.394459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.394531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:49 crc 
kubenswrapper[4755]: I0320 13:30:49.394556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:50 crc kubenswrapper[4755]: I0320 13:30:50.166462 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:50Z is after 2026-02-23T05:33:13Z Mar 20 13:30:51 crc kubenswrapper[4755]: I0320 13:30:51.167057 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:51Z is after 2026-02-23T05:33:13Z Mar 20 13:30:51 crc kubenswrapper[4755]: E0320 13:30:51.314088 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.168340 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:52Z is after 2026-02-23T05:33:13Z Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.961821 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.962065 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.963813 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.963881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.963904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:53 crc kubenswrapper[4755]: I0320 13:30:53.168917 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:53Z is after 2026-02-23T05:33:13Z Mar 20 13:30:54 crc kubenswrapper[4755]: I0320 13:30:54.167818 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:54Z is after 2026-02-23T05:33:13Z Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.167980 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:55Z is after 2026-02-23T05:33:13Z Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.741704 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.741942 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:55 
crc kubenswrapper[4755]: I0320 13:30:55.743568 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.743644 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.743838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:55 crc kubenswrapper[4755]: E0320 13:30:55.748609 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:30:56 crc kubenswrapper[4755]: E0320 13:30:56.147321 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:56Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.153792 4755 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155356 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155427 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:56 crc kubenswrapper[4755]: E0320 13:30:56.160296 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.167192 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:56Z is after 2026-02-23T05:33:13Z Mar 20 13:30:57 crc kubenswrapper[4755]: I0320 13:30:57.167425 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:57Z is after 2026-02-23T05:33:13Z Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.168054 4755 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:58Z is after 2026-02-23T05:33:13Z Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.225196 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.226836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.226882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.226897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.227726 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.741866 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.742020 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 
13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.169623 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:59Z is after 2026-02-23T05:33:13Z Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.429464 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.431708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"} Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.431887 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.433060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.433128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.433151 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.167906 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:00Z is 
after 2026-02-23T05:33:13Z Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.437801 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.438693 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441631 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1" exitCode=255 Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"} Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441781 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441987 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.443253 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.443314 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.443339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 
13:31:00.444384 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1" Mar 20 13:31:00 crc kubenswrapper[4755]: E0320 13:31:00.444796 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:01 crc kubenswrapper[4755]: I0320 13:31:01.166761 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:01Z is after 2026-02-23T05:33:13Z Mar 20 13:31:01 crc kubenswrapper[4755]: E0320 13:31:01.314581 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:31:01 crc kubenswrapper[4755]: I0320 13:31:01.324355 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:31:01 crc kubenswrapper[4755]: E0320 13:31:01.330807 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:31:01 crc kubenswrapper[4755]: E0320 13:31:01.332130 4755 certificate_manager.go:440] 
"Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 20 13:31:01 crc kubenswrapper[4755]: I0320 13:31:01.448341 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:31:02 crc kubenswrapper[4755]: I0320 13:31:02.168288 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:02Z is after 2026-02-23T05:33:13Z Mar 20 13:31:03 crc kubenswrapper[4755]: E0320 13:31:03.153754 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:03Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.161183 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162498 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162509 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162556 4755 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.167335 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:03Z is after 2026-02-23T05:33:13Z Mar 20 13:31:03 crc kubenswrapper[4755]: E0320 13:31:03.167738 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:31:04 crc kubenswrapper[4755]: I0320 13:31:04.166300 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:04Z is after 2026-02-23T05:33:13Z Mar 20 13:31:05 crc kubenswrapper[4755]: I0320 13:31:05.168341 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z Mar 20 13:31:05 crc kubenswrapper[4755]: W0320 13:31:05.269704 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z Mar 20 13:31:05 crc kubenswrapper[4755]: E0320 13:31:05.269829 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:31:05 crc kubenswrapper[4755]: E0320 13:31:05.754689 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:06 crc kubenswrapper[4755]: W0320 13:31:06.139060 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:06Z is after 2026-02-23T05:33:13Z Mar 20 13:31:06 crc kubenswrapper[4755]: E0320 13:31:06.139206 
4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.168991 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:06Z is after 2026-02-23T05:33:13Z Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.967702 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.967983 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.969727 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.969779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.969800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.970708 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1" Mar 20 13:31:06 crc kubenswrapper[4755]: E0320 13:31:06.971177 4755 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:07 crc kubenswrapper[4755]: I0320 13:31:07.167150 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:07Z is after 2026-02-23T05:33:13Z Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.168522 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:08Z is after 2026-02-23T05:33:13Z Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.741824 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.741976 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.746220 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.746482 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.748229 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.748297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.748323 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.749245 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1" Mar 20 13:31:08 crc kubenswrapper[4755]: E0320 13:31:08.749540 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:09 crc kubenswrapper[4755]: I0320 13:31:09.166361 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:09Z is after 2026-02-23T05:33:13Z Mar 20 13:31:10 crc kubenswrapper[4755]: E0320 13:31:10.159885 4755 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:10Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.167878 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:10Z is after 2026-02-23T05:33:13Z Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.167973 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169709 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169769 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:10 crc kubenswrapper[4755]: E0320 13:31:10.175114 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:10Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:31:11 crc kubenswrapper[4755]: W0320 13:31:11.008031 4755 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z Mar 20 13:31:11 crc kubenswrapper[4755]: E0320 13:31:11.008143 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:31:11 crc kubenswrapper[4755]: W0320 13:31:11.114931 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z Mar 20 13:31:11 crc kubenswrapper[4755]: E0320 13:31:11.115084 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:31:11 crc kubenswrapper[4755]: I0320 13:31:11.167410 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z Mar 20 13:31:11 crc kubenswrapper[4755]: E0320 13:31:11.314735 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:31:12 crc kubenswrapper[4755]: I0320 13:31:12.168054 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:12Z is after 2026-02-23T05:33:13Z Mar 20 13:31:13 crc kubenswrapper[4755]: I0320 13:31:13.167455 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:13Z is after 2026-02-23T05:33:13Z Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.167765 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:14Z is after 2026-02-23T05:33:13Z Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.260836 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.261125 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.262846 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.262923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.262951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:15 crc kubenswrapper[4755]: I0320 13:31:15.167812 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:15Z is after 2026-02-23T05:33:13Z Mar 20 13:31:15 crc kubenswrapper[4755]: E0320 13:31:15.761835 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:16 crc kubenswrapper[4755]: I0320 13:31:16.167744 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:16Z is after 2026-02-23T05:33:13Z Mar 20 13:31:17 crc kubenswrapper[4755]: E0320 13:31:17.165225 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.167753 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 2026-02-23T05:33:13Z Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.175900 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177953 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177988 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:17 crc kubenswrapper[4755]: E0320 13:31:17.181394 4755 kubelet_node_status.go:99] "Unable to register node with 
API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.168432 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:18Z is after 2026-02-23T05:33:13Z Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742007 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742148 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742234 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742484 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744147 4755 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744947 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.745109 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8" gracePeriod=30 Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.169205 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:19Z is after 2026-02-23T05:33:13Z Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.520906 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.523081 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.523971 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8" exitCode=255 Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524052 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8"} Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3"} Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524148 4755 scope.go:117] "RemoveContainer" containerID="ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e" Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524284 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.525949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.526015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.526042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:20 crc kubenswrapper[4755]: I0320 13:31:20.164766 4755 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:20Z is after 2026-02-23T05:33:13Z Mar 20 13:31:20 crc kubenswrapper[4755]: I0320 13:31:20.530306 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.167599 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:13Z Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.224640 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.226303 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.226375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.226393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.227379 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1" Mar 20 13:31:21 crc kubenswrapper[4755]: E0320 13:31:21.315104 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to 
get node info: node \"crc\" not found" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.536938 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.538155 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"} Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.538306 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.539026 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.539056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.539065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.167948 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:13Z Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.545105 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.545961 4755 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.548690 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" exitCode=255 Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.548737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"} Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.548786 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.549024 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.550816 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.550850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.550867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.551458 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:22 crc kubenswrapper[4755]: E0320 13:31:22.551615 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.962415 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.962722 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.964512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.964599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.964625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:23 crc kubenswrapper[4755]: I0320 13:31:23.168031 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:23 crc kubenswrapper[4755]: I0320 13:31:23.555043 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:31:24 crc kubenswrapper[4755]: E0320 13:31:24.167766 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 
13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.167896 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.181930 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183095 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183129 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183162 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:24 crc kubenswrapper[4755]: E0320 13:31:24.190959 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.167876 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.741755 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.742030 4755 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.743816 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.743884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.743901 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.771242 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.776476 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.781837 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.787087 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC 
m=+0.812982152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.792934 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fcebc88d49d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.319672989 +0000 UTC m=+0.917605528,LastTimestamp:2026-03-20 13:30:21.319672989 +0000 UTC m=+0.917605528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.800187 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.327144893 +0000 UTC m=+0.925077422,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc 
kubenswrapper[4755]: E0320 13:31:25.806813 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.327171834 +0000 UTC m=+0.925104363,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.811988 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.327182474 +0000 UTC m=+0.925115003,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.817982 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.32840772 +0000 UTC m=+0.926340249,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.822543 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.32841782 +0000 UTC m=+0.926350349,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.826747 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.328426481 +0000 UTC m=+0.926359000,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.832185 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.329498603 +0000 UTC m=+0.927431142,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.840573 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.329522533 +0000 UTC m=+0.927455072,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.846025 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.329534333 +0000 UTC m=+0.927466872,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.852371 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.331199138 +0000 UTC m=+0.929131677,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.857435 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.331213869 +0000 UTC m=+0.929146408,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.863248 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC 
m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.331224599 +0000 UTC m=+0.929157138,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.867500 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.333021076 +0000 UTC m=+0.930953615,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.871191 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.333035517 +0000 UTC m=+0.930968056,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.876490 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.333046477 +0000 UTC m=+0.930979016,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.885829 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.333910635 +0000 UTC m=+0.931843174,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.891939 4755 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.333925125 +0000 UTC m=+0.931857664,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.896823 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.333935245 +0000 UTC m=+0.931867784,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.902952 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.335036238 +0000 UTC m=+0.932968777,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.909908 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.335050168 +0000 UTC m=+0.932982707,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.918140 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fced5103449 openshift-machine-config-operator 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.731198025 +0000 UTC m=+1.329130564,LastTimestamp:2026-03-20 13:30:21.731198025 +0000 UTC m=+1.329130564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.923422 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fced568a5bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.736994235 +0000 UTC m=+1.334926764,LastTimestamp:2026-03-20 13:30:21.736994235 +0000 UTC m=+1.334926764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 
crc kubenswrapper[4755]: E0320 13:31:25.929352 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fced5977e8f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.740064399 +0000 UTC m=+1.337996958,LastTimestamp:2026-03-20 13:30:21.740064399 +0000 UTC m=+1.337996958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.935705 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fced627be29 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:30:21.749517865 +0000 UTC m=+1.347450424,LastTimestamp:2026-03-20 13:30:21.749517865 +0000 UTC m=+1.347450424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.942578 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fced691c4e0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.7564664 +0000 UTC m=+1.354398969,LastTimestamp:2026-03-20 13:30:21.7564664 +0000 UTC m=+1.354398969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.948137 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf1231daf6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.756813558 +0000 UTC m=+2.354746127,LastTimestamp:2026-03-20 13:30:22.756813558 +0000 UTC m=+2.354746127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.952735 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf13b67c8c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782282892 +0000 UTC m=+2.380215461,LastTimestamp:2026-03-20 13:30:22.782282892 +0000 UTC m=+2.380215461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.956914 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf13ba920a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782550538 +0000 UTC m=+2.380483097,LastTimestamp:2026-03-20 13:30:22.782550538 +0000 UTC m=+2.380483097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.961788 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf13c0b3b9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782952377 +0000 UTC m=+2.380884946,LastTimestamp:2026-03-20 13:30:22.782952377 +0000 UTC m=+2.380884946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.968681 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf13c11c91 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782979217 +0000 UTC m=+2.380911776,LastTimestamp:2026-03-20 13:30:22.782979217 +0000 UTC m=+2.380911776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.976816 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf200d7b27 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.989310759 +0000 UTC m=+2.587243318,LastTimestamp:2026-03-20 13:30:22.989310759 +0000 UTC m=+2.587243318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.983949 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf27b70181 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.117861249 +0000 UTC m=+2.715793808,LastTimestamp:2026-03-20 13:30:23.117861249 +0000 UTC m=+2.715793808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.987910 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf280e62ac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.123587756 +0000 UTC m=+2.721520315,LastTimestamp:2026-03-20 13:30:23.123587756 +0000 UTC m=+2.721520315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.992901 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf2813d9f6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.123945974 +0000 UTC m=+2.721878543,LastTimestamp:2026-03-20 13:30:23.123945974 +0000 UTC m=+2.721878543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.996475 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf281d1979 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.124552057 +0000 UTC m=+2.722484626,LastTimestamp:2026-03-20 13:30:23.124552057 +0000 UTC m=+2.722484626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.000767 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf28517c01 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.127985153 +0000 UTC m=+2.725917692,LastTimestamp:2026-03-20 13:30:23.127985153 +0000 UTC m=+2.725917692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.004742 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf2f75c2bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.247803069 +0000 UTC 
m=+2.845735598,LastTimestamp:2026-03-20 13:30:23.247803069 +0000 UTC m=+2.845735598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.008900 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf2fcedf50 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.253643088 +0000 UTC m=+2.851575617,LastTimestamp:2026-03-20 13:30:23.253643088 +0000 UTC m=+2.851575617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.013753 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf3026ebbc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.259413436 +0000 UTC m=+2.857346005,LastTimestamp:2026-03-20 13:30:23.259413436 +0000 UTC m=+2.857346005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.017490 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf303558a1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.260358817 +0000 UTC m=+2.858291386,LastTimestamp:2026-03-20 13:30:23.260358817 +0000 UTC m=+2.858291386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.021156 4755 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf3d530922 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.480408354 +0000 UTC m=+3.078340883,LastTimestamp:2026-03-20 13:30:23.480408354 +0000 UTC m=+3.078340883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.026084 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3d634259 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481471577 +0000 UTC m=+3.079404106,LastTimestamp:2026-03-20 13:30:23.481471577 +0000 UTC m=+3.079404106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.030531 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf3d63e46c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481513068 +0000 UTC m=+3.079445597,LastTimestamp:2026-03-20 13:30:23.481513068 +0000 UTC m=+3.079445597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.035418 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf3d6a07bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481915327 +0000 UTC m=+3.079847856,LastTimestamp:2026-03-20 13:30:23.481915327 +0000 UTC 
m=+3.079847856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.039846 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf3dbbc93c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.487273276 +0000 UTC m=+3.085205815,LastTimestamp:2026-03-20 13:30:23.487273276 +0000 UTC m=+3.085205815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.044634 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf3e09484e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.492352078 +0000 UTC m=+3.090284607,LastTimestamp:2026-03-20 
13:30:23.492352078 +0000 UTC m=+3.090284607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.048571 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf3e163ba8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.493200808 +0000 UTC m=+3.091133337,LastTimestamp:2026-03-20 13:30:23.493200808 +0000 UTC m=+3.091133337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.053038 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3e469596 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.496369558 +0000 UTC m=+3.094302087,LastTimestamp:2026-03-20 13:30:23.496369558 +0000 UTC m=+3.094302087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.058201 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3e538dda openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.497219546 +0000 UTC m=+3.095152075,LastTimestamp:2026-03-20 13:30:23.497219546 +0000 UTC m=+3.095152075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.064143 4755 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf40083710 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.52583656 +0000 UTC m=+3.123769089,LastTimestamp:2026-03-20 13:30:23.52583656 +0000 UTC m=+3.123769089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.068610 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4031bb1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.52855734 +0000 UTC m=+3.126489869,LastTimestamp:2026-03-20 13:30:23.52855734 +0000 UTC m=+3.126489869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc 
kubenswrapper[4755]: E0320 13:31:26.073144 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf403a414f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.529115983 +0000 UTC m=+3.127048512,LastTimestamp:2026-03-20 13:30:23.529115983 +0000 UTC m=+3.127048512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.078047 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4056bf88 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.530983304 +0000 UTC m=+3.128915833,LastTimestamp:2026-03-20 13:30:23.530983304 +0000 UTC 
m=+3.128915833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.082564 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf4b256111 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.712297233 +0000 UTC m=+3.310229802,LastTimestamp:2026-03-20 13:30:23.712297233 +0000 UTC m=+3.310229802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.088082 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf4b81cbad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.718353837 +0000 UTC m=+3.316286366,LastTimestamp:2026-03-20 13:30:23.718353837 +0000 UTC m=+3.316286366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.094093 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf4c887f70 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.735570288 +0000 UTC m=+3.333502847,LastTimestamp:2026-03-20 13:30:23.735570288 +0000 UTC m=+3.333502847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.098051 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf4ca21654 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.737247316 +0000 UTC m=+3.335179845,LastTimestamp:2026-03-20 13:30:23.737247316 +0000 UTC m=+3.335179845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.102841 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4d049fbb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.743705019 +0000 UTC m=+3.341637558,LastTimestamp:2026-03-20 13:30:23.743705019 +0000 UTC m=+3.341637558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.108478 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf4d2b82cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.746253515 +0000 UTC m=+3.344186054,LastTimestamp:2026-03-20 13:30:23.746253515 +0000 UTC m=+3.344186054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.113789 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf4d57b66f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.749150319 +0000 UTC m=+3.347082848,LastTimestamp:2026-03-20 13:30:23.749150319 +0000 UTC 
m=+3.347082848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.118366 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4f0dd58f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.777863055 +0000 UTC m=+3.375795584,LastTimestamp:2026-03-20 13:30:23.777863055 +0000 UTC m=+3.375795584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.123685 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4f27ad05 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.779556613 +0000 UTC m=+3.377489152,LastTimestamp:2026-03-20 13:30:23.779556613 +0000 UTC m=+3.377489152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.128720 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf59d18c94 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.958461588 +0000 UTC m=+3.556394117,LastTimestamp:2026-03-20 13:30:23.958461588 +0000 UTC m=+3.556394117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.138908 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf5a5b7c56 openshift-kube-scheduler 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.967501398 +0000 UTC m=+3.565433927,LastTimestamp:2026-03-20 13:30:23.967501398 +0000 UTC m=+3.565433927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.143542 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf5b6859cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.985121739 +0000 UTC m=+3.583054268,LastTimestamp:2026-03-20 13:30:23.985121739 +0000 UTC m=+3.583054268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.147707 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf5bb4f1ba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.99014137 +0000 UTC m=+3.588073899,LastTimestamp:2026-03-20 13:30:23.99014137 +0000 UTC m=+3.588073899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.151975 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf5c79045b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.002991195 +0000 UTC m=+3.600923724,LastTimestamp:2026-03-20 13:30:24.002991195 +0000 UTC m=+3.600923724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: 
E0320 13:31:26.156277 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf5dc1e79e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.024545182 +0000 UTC m=+3.622477711,LastTimestamp:2026-03-20 13:30:24.024545182 +0000 UTC m=+3.622477711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.160170 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf5dd735ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.025941454 +0000 UTC 
m=+3.623873993,LastTimestamp:2026-03-20 13:30:24.025941454 +0000 UTC m=+3.623873993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.166226 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.166179 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6a6a16de openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.236893918 +0000 UTC m=+3.834826477,LastTimestamp:2026-03-20 13:30:24.236893918 +0000 UTC m=+3.834826477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.169966 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6b766474 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.254477428 +0000 UTC m=+3.852409967,LastTimestamp:2026-03-20 13:30:24.254477428 +0000 UTC m=+3.852409967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.175016 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6b8ba7db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.255870939 +0000 UTC m=+3.853803468,LastTimestamp:2026-03-20 13:30:24.255870939 +0000 UTC m=+3.853803468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.181041 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf6c6e562a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.270726698 +0000 UTC m=+3.868659257,LastTimestamp:2026-03-20 13:30:24.270726698 +0000 UTC m=+3.868659257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.186724 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf78b4dcb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.476675253 +0000 UTC m=+4.074607772,LastTimestamp:2026-03-20 13:30:24.476675253 +0000 UTC m=+4.074607772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc 
kubenswrapper[4755]: E0320 13:31:26.193063 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf78f6a60e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.480986638 +0000 UTC m=+4.078919167,LastTimestamp:2026-03-20 13:30:24.480986638 +0000 UTC m=+4.078919167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.194805 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf79938738 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.491267896 +0000 UTC m=+4.089200425,LastTimestamp:2026-03-20 13:30:24.491267896 +0000 UTC m=+4.089200425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.200743 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf7a15ddbd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.499809725 +0000 UTC m=+4.097742254,LastTimestamp:2026-03-20 13:30:24.499809725 +0000 UTC m=+4.097742254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.206889 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfa936342d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.290458157 +0000 UTC m=+4.888390726,LastTimestamp:2026-03-20 13:30:25.290458157 +0000 UTC 
m=+4.888390726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.212065 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfb6898698 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.514022552 +0000 UTC m=+5.111955081,LastTimestamp:2026-03-20 13:30:25.514022552 +0000 UTC m=+5.111955081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.216182 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfb747bda8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.526488488 +0000 UTC m=+5.124421017,LastTimestamp:2026-03-20 13:30:25.526488488 +0000 UTC m=+5.124421017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.220670 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfb75abf96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.527734166 +0000 UTC m=+5.125666695,LastTimestamp:2026-03-20 13:30:25.527734166 +0000 UTC m=+5.125666695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.226097 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfc4d36f69 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.753747305 +0000 UTC m=+5.351679874,LastTimestamp:2026-03-20 13:30:25.753747305 +0000 UTC 
m=+5.351679874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.230634 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfc58d1b0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.765915404 +0000 UTC m=+5.363847973,LastTimestamp:2026-03-20 13:30:25.765915404 +0000 UTC m=+5.363847973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.235446 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfc5a51121 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.767485729 +0000 UTC 
m=+5.365418248,LastTimestamp:2026-03-20 13:30:25.767485729 +0000 UTC m=+5.365418248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.239792 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfd3377299 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.995182745 +0000 UTC m=+5.593115274,LastTimestamp:2026-03-20 13:30:25.995182745 +0000 UTC m=+5.593115274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.243806 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfd4217b55 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.010520405 +0000 UTC m=+5.608452934,LastTimestamp:2026-03-20 13:30:26.010520405 +0000 UTC 
m=+5.608452934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.250314 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfd43202df openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.011603679 +0000 UTC m=+5.609536208,LastTimestamp:2026-03-20 13:30:26.011603679 +0000 UTC m=+5.609536208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.255101 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfe30318fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.260187388 +0000 UTC 
m=+5.858119917,LastTimestamp:2026-03-20 13:30:26.260187388 +0000 UTC m=+5.858119917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.259144 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfe4323456 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.280051798 +0000 UTC m=+5.877984327,LastTimestamp:2026-03-20 13:30:26.280051798 +0000 UTC m=+5.877984327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.266107 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfe45e94f1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.282960113 +0000 UTC m=+5.880892642,LastTimestamp:2026-03-20 13:30:26.282960113 +0000 UTC m=+5.880892642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.270645 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcff14b2bb6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.499791798 +0000 UTC m=+6.097724337,LastTimestamp:2026-03-20 13:30:26.499791798 +0000 UTC m=+6.097724337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.278239 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcff2685703 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:30:26.518480643 +0000 UTC m=+6.116413192,LastTimestamp:2026-03-20 13:30:26.518480643 +0000 UTC m=+6.116413192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.285882 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.289483 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.295589 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-apiserver-crc.189e8fd218c1a904 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:31:26 crc kubenswrapper[4755]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:31:26 crc kubenswrapper[4755]: Mar 20 13:31:26 crc kubenswrapper[4755]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:35.75180314 +0000 UTC m=+15.349735679,LastTimestamp:2026-03-20 13:30:35.75180314 +0000 UTC m=+15.349735679,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.296775 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fd218c33957 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:35.751905623 +0000 UTC m=+15.349838162,LastTimestamp:2026-03-20 13:30:35.751905623 +0000 UTC m=+15.349838162,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.299855 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-apiserver-crc.189e8fd2190d0fd8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:31:26 crc kubenswrapper[4755]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 20 13:31:26 crc kubenswrapper[4755]: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:35.756744664 +0000 UTC m=+15.354677203,LastTimestamp:2026-03-20 13:30:35.756744664 +0000 UTC m=+15.354677203,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.305281 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fd218c33957\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fd218c33957 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:30:35.751905623 +0000 UTC m=+15.349838162,LastTimestamp:2026-03-20 13:30:35.756803085 +0000 UTC m=+15.354735634,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.309313 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fcf6b8ba7db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6b8ba7db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.255870939 +0000 UTC m=+3.853803468,LastTimestamp:2026-03-20 13:30:36.337002727 +0000 UTC m=+15.934935256,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.313712 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fcf78b4dcb5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf78b4dcb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.476675253 +0000 UTC m=+4.074607772,LastTimestamp:2026-03-20 13:30:36.589389998 +0000 UTC m=+16.187322527,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.317821 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fcf79938738\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf79938738 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.491267896 +0000 UTC m=+4.089200425,LastTimestamp:2026-03-20 13:30:36.606442021 +0000 UTC m=+16.204374550,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.322395 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:38.743146511 +0000 UTC m=+18.341079080,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.327018 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e8af46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:38.743235624 +0000 UTC m=+18.341168183,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.332701 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:48.742787215 +0000 UTC m=+28.340719784,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.337862 4755 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189e8fd076e8af46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:48.74296303 +0000 UTC m=+28.340895599,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.343031 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd51f4deae4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:48.746560228 
+0000 UTC m=+28.344492797,LastTimestamp:2026-03-20 13:30:48.746560228 +0000 UTC m=+28.344492797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.349286 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fcf28517c01\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf28517c01 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.127985153 +0000 UTC m=+2.725917692,LastTimestamp:2026-03-20 13:30:48.865021428 +0000 UTC m=+28.462953957,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.354766 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fcf3d634259\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3d634259 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481471577 +0000 UTC m=+3.079404106,LastTimestamp:2026-03-20 13:30:49.095701028 +0000 UTC m=+28.693633557,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.360905 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fcf3e469596\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3e469596 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.496369558 +0000 UTC m=+3.094302087,LastTimestamp:2026-03-20 13:30:49.107178 +0000 UTC m=+28.705110569,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.366779 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:58.741974225 +0000 UTC m=+38.339906784,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.371219 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e8af46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:58.742061577 +0000 UTC m=+38.339994146,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.377043 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:31:08.741933408 +0000 UTC m=+48.339865977,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.967614 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.967844 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.968971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.969025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.969038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.969587 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.969819 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:27 crc kubenswrapper[4755]: I0320 13:31:27.170274 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.167982 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.742538 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.742630 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.746349 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.746563 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.748226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.748319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.748359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.749367 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:28 crc kubenswrapper[4755]: E0320 
13:31:28.749705 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:29 crc kubenswrapper[4755]: I0320 13:31:29.166929 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:30 crc kubenswrapper[4755]: I0320 13:31:30.171385 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.170407 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.171251 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.191366 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193388 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193439 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193483 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.200680 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.316598 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:31:31 crc kubenswrapper[4755]: W0320 13:31:31.451161 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.451254 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:31:32 crc kubenswrapper[4755]: I0320 13:31:32.170953 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:32 crc kubenswrapper[4755]: W0320 13:31:32.827737 4755 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:31:32 crc kubenswrapper[4755]: E0320 13:31:32.827813 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:31:33 crc kubenswrapper[4755]: I0320 13:31:33.167214 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:33 crc kubenswrapper[4755]: I0320 13:31:33.333689 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:31:33 crc kubenswrapper[4755]: I0320 13:31:33.358075 4755 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:31:34 crc kubenswrapper[4755]: I0320 13:31:34.173548 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.170254 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.749692 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.749961 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.751477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.751526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.751540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.757417 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.168241 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.596108 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.597831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.597891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.597906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:36 crc 
kubenswrapper[4755]: I0320 13:31:36.897541 4755 csr.go:261] certificate signing request csr-xt5nl is approved, waiting to be issued Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.909006 4755 csr.go:257] certificate signing request csr-xt5nl is issued Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.954540 4755 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.979475 4755 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 13:31:37 crc kubenswrapper[4755]: I0320 13:31:37.911286 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 07:34:26.827083117 +0000 UTC Mar 20 13:31:37 crc kubenswrapper[4755]: I0320 13:31:37.911351 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6786h2m48.915736057s for next certificate rotation Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.201899 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.203839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.203911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.203931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.204149 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.216520 4755 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 
20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.217016 4755 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.217060 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222638 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222719 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.243145 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253537 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253696 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.269966 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281562 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281588 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.298319 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311241 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311288 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.328562 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.328832 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.328877 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.429182 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.529780 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.630748 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.731196 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.832086 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.933093 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.033479 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.134442 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.235537 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.336295 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.436722 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.537744 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.638199 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.738801 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.840008 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.941138 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.042247 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.143235 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: I0320 13:31:40.225061 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:40 crc kubenswrapper[4755]: I0320 13:31:40.226842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:40 crc 
kubenswrapper[4755]: I0320 13:31:40.226904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:40 crc kubenswrapper[4755]: I0320 13:31:40.226920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.244421 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.345506 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.446486 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.546897 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.647747 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.106450 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.206634 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.307105 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.317128 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.408049 4755 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.508999 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.609232 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.710264 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.810757 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.911800 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.012591 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.113629 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.214264 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.225039 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.226756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.226817 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.226837 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.227891 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.228196 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.314922 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.360934 4755 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.415495 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.516373 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.617330 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.718344 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.819040 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.919257 
4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.020052 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.120672 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.221343 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.322294 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.423227 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.524145 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.624513 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.724767 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.825775 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.926469 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.027600 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc 
kubenswrapper[4755]: E0320 13:31:44.128341 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.229260 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.329634 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.430738 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.531721 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.632332 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.732969 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.833139 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.933947 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.034995 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.135853 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.236800 4755 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.337552 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.438521 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.539697 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.639798 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.740884 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.841725 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.942728 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.043889 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.144065 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.244957 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.346097 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.382914 4755 reflector.go:368] Caches 
populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448898 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448935 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552890 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552982 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.655946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.655996 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.656008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.656028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.656044 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758896 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862939 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966777 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966815 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069518 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172324 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.180636 4755 apiserver.go:52] "Watching apiserver" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.186082 4755 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.186506 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187011 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187063 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187184 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187235 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187424 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187636 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.187906 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.188128 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.188201 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.190027 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191097 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191271 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191592 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191670 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191815 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.194225 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.194371 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.195018 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.228405 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.251383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.262216 4755 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.267295 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275316 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.280492 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.295409 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.309336 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.326477 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.336585 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352317 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352386 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352441 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352469 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352516 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 
13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352539 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352566 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352588 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352617 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352641 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352702 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352734 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352723 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352756 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352764 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352794 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352822 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352874 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352904 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352920 4755 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352928 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352953 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353008 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353031 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353053 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353059 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353075 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353130 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353153 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353199 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353218 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353243 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353262 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353277 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353289 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353336 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353362 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353386 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353493 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353533 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353544 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353560 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353642 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353708 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:31:47 crc 
kubenswrapper[4755]: I0320 13:31:47.353715 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353742 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353775 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353808 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353872 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353927 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353949 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353952 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353993 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354013 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354018 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354035 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354053 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354087 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354103 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354138 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354169 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354205 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354222 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354240 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354256 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354303 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354323 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354356 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354375 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354391 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354406 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354426 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354467 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354485 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354501 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354539 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354576 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354593 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354611 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354688 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354722 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354743 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354760 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354776 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354893 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354911 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354931 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354949 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354998 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355017 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355035 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355057 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355073 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355091 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355108 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355125 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355159 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355194 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355213 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355233 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355255 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355299 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355346 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355381 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355453 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355482 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355509 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355532 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355554 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355575 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356564 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356829 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356910 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356949 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356995 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357032 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357184 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357220 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357257 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357290 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357370 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357409 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357449 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357490 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357525 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357636 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357848 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357884 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357959 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358004 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358127 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358163 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358196 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358227 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358258 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358290 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358631 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358769 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358808 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358882 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 13:31:47 crc 
kubenswrapper[4755]: I0320 13:31:47.358913 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358944 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359008 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359047 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359083 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359116 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359157 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359196 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359264 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359298 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359335 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359371 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359447 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359482 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361090 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361145 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362137 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc 
kubenswrapper[4755]: I0320 13:31:47.362217 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362244 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362343 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362375 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362406 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362433 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362460 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362556 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362738 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362816 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362833 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362851 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362867 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362905 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362927 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362954 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362972 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362992 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363015 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363034 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363054 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363076 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354115 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354418 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354446 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354505 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354587 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354693 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354852 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355218 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355471 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355617 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355942 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356315 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357178 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357730 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357910 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357935 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359134 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359538 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360096 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366614 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360163 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360322 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360638 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360757 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361290 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360952 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361980 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362195 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362508 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363740 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363815 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364009 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364094 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364504 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364512 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364998 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365172 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365706 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365788 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365903 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366111 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366279 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366607 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366909 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367474 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367754 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367926 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367936 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367948 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368016 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368103 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368271 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368353 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369389 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369481 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369569 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.369832 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369984 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.370362 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.370360 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.370536 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.870500899 +0000 UTC m=+87.468433618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.372187 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.872169886 +0000 UTC m=+87.470102645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.370926 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371094 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371108 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371130 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371294 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371359 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371438 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371460 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371673 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.372564 4755 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.373025 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.373107 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.873085171 +0000 UTC m=+87.471017710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.373523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.373600 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387433 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387476 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387499 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387586 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.887558661 +0000 UTC m=+87.485491380 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390189 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.391136 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.391477 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.391245 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.392807 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.392892 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393230 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393255 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393706 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393718 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.393827 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.393866 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.393881 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.394334 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.394380 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.893918197 +0000 UTC m=+87.491850936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.394556 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395342 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
(UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395475 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395741 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.396174 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.397358 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.398754 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.398965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.399014 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.403873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407768 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407870 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407939 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408106 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408293 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408343 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408370 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.409684 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410220 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410570 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410669 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.411084 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.411501 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.411839 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412145 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412275 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412330 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413212 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413310 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413564 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414252 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414309 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414113 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414434 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414646 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414755 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414877 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415373 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415410 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415466 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415510 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415544 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415589 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415594 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415817 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416033 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416084 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416245 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416440 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416700 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416821 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416875 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417110 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417125 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417575 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417890 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417952 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418756 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418835 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418876 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418900 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419178 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419235 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419271 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419669 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.420590 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.420695 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.421143 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.421620 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.427529 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.434882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.441364 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.444712 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.454518 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464329 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464353 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464398 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464513 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464924 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465055 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465158 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465257 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465380 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc 
kubenswrapper[4755]: I0320 13:31:47.465498 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465852 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465971 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466091 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466210 4755 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466304 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466392 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 
13:31:47.466474 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466549 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466633 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466746 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466838 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466930 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467014 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467097 4755 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467180 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467280 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467365 4755 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467451 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467574 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467675 4755 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467756 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467847 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467934 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468016 4755 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468094 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468175 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468250 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468332 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468532 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468629 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468758 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468852 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468934 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469019 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469109 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469205 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469290 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469366 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469445 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469548 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469630 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469748 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469832 4755 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469910 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469984 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470082 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470190 4755 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470295 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470392 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470480 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470566 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470648 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470775 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470877 4755 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470988 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471097 4755 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471219 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471332 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471444 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471587 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471730 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471862 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471982 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472091 4755 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472179 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472260 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472335 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472424 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472501 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472581 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472704 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 
13:31:47.472814 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472934 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473061 4755 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473151 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473228 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473321 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473498 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473588 4755 reconciler_common.go:293] 
"Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473695 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473776 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473850 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473924 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473997 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474091 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474170 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474252 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474326 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474551 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474636 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474750 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474834 4755 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475162 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475241 4755 reconciler_common.go:293] "Volume detached 
for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475323 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475433 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475541 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475628 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475736 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475771 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475790 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" 
(UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475804 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475819 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475835 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475850 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475863 4755 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475878 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475891 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475905 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475917 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475930 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475943 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475956 4755 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475968 4755 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475985 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476002 4755 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476017 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476032 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476045 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476058 4755 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476092 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476106 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476121 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476136 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476150 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476162 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476175 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476188 4755 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476200 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476213 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476226 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476238 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476251 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476264 4755 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476277 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476291 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476304 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476316 4755 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476328 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476341 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476353 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476366 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476379 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476400 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476414 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476426 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476438 4755 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476450 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476462 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476475 4755 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476490 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476502 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476515 4755 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476527 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476542 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476556 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476571 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476584 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476599 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476612 4755 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476624 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476640 4755 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476677 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476689 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476703 4755 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476716 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476727 4755 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476740 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476752 4755 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476763 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476775 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476787 4755 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494567 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494580 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.508376 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.527573 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.532717 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Star
tupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.536027 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.536311 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: W0320 13:31:47.538583 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd WatchSource:0}: Error finding container 2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd: Status 404 returned error can't find the container with id 2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.542363 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 
13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:31:47 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:31:47 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:31:47 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:31:47 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:31:47 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:31:47 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-approver \ Mar 20 13:31:47 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:31:47 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc 
kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.547899 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-webhook \ Mar 20 13:31:47 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.549715 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.551177 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:31:47 crc kubenswrapper[4755]: else Mar 20 13:31:47 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:31:47 crc kubenswrapper[4755]: exit 1 Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:31:47 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.552338 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 
13:31:47.597526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597571 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.630885 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ad044424dec10a9b85e73f38d92753d33328812ddc6316e8013c584622dfca9b"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.632431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.633916 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8c6f52fc6626fb0935c8fd231551aaddd00cb052a5ed9129d1ae96ea000b0a56"} Mar 20 13:31:47 crc 
kubenswrapper[4755]: E0320 13:31:47.634276 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:31:47 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:31:47 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:31:47 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:31:47 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:31:47 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:31:47 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-approver \ Mar 20 13:31:47 crc kubenswrapper[4755]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:31:47 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.636040 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.636242 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:31:47 crc 
kubenswrapper[4755]: else Mar 20 13:31:47 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:31:47 crc kubenswrapper[4755]: exit 1 Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,V
alue:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f
5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.637320 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.637421 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.637484 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-webhook \ Mar 20 13:31:47 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.639098 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.648253 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.661640 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.677626 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.693594 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701957 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.705880 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.717345 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.730207 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.743091 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.756762 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.773547 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.789631 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805148 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.812227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.880202 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.880365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880542 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880632 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.880603309 +0000 UTC m=+88.478535878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880711 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.880688932 +0000 UTC m=+88.478621641 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.880752 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880933 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.881007 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.88098827 +0000 UTC m=+88.478920839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908096 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908165 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908177 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908201 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908215 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.982198 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.982307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982531 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982571 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982597 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982645 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 
13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982716 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982736 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982755 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.982722807 +0000 UTC m=+88.580655376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982824 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.982792078 +0000 UTC m=+88.580724617 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011511 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011524 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011545 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011560 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114290 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114329 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114355 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114367 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216794 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216819 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216829 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.224953 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.225043 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319757 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.422793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.422961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.422980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.423008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.423025 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525773 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525866 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525879 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.543939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544571 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.559462 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.576263 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582377 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582392 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.599794 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604845 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604860 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.621379 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627093 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627136 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627154 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.641530 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.641708 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644490 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644543 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747368 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747429 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747459 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850879 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850890 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.891788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.891894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.891928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892062 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892100 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892141 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.892096268 +0000 UTC m=+90.490028827 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892208 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.89218332 +0000 UTC m=+90.490115849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892236 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.892227671 +0000 UTC m=+90.490160200 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953233 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.993139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.993253 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993495 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993557 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993594 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993552 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:48 crc 
kubenswrapper[4755]: E0320 13:31:48.993646 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993701 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993742 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.99371271 +0000 UTC m=+90.591645269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993797 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.993772982 +0000 UTC m=+90.591705511 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055890 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055953 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.056017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158896 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158956 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.225253 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.225353 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:49 crc kubenswrapper[4755]: E0320 13:31:49.225583 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:49 crc kubenswrapper[4755]: E0320 13:31:49.225743 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.229792 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.230342 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.231765 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.232400 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.233430 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.233932 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.234564 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.235555 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.236295 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.237386 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.237961 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.239084 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.239578 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.240172 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.241089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.241604 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.242580 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.242992 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.243572 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.244592 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.245068 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.246020 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.246460 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.247461 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.247925 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.248576 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.249666 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.250129 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.251169 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.251606 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.252493 4755 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.252613 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.254218 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.255090 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.255549 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.257102 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.257876 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 
13:31:49.258797 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.259448 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.260871 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.261380 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.261985 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262045 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262107 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262460 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.263166 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.264125 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.264591 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.265495 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.266087 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.267135 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.267627 4755 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.268441 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.268965 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.269880 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.270442 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.271068 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 
13:31:49.365207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365222 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468844 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468870 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468888 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572105 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572177 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572260 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675748 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675780 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675794 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779104 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779295 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.882946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.882990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.883000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.883015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.883025 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986399 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089517 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089527 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192740 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192792 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192821 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192835 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.224538 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.224685 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295925 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399550 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399561 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502713 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608225 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608269 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712552 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815455 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815471 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.911049 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.911128 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.911165 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911299 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911301 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:54.911239359 +0000 UTC m=+94.509171888 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911298 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911364 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:54.911347652 +0000 UTC m=+94.509280181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911384 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:54.911377623 +0000 UTC m=+94.509310152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918084 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918156 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.012412 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.012469 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012542 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012568 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012569 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012581 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc 
kubenswrapper[4755]: E0320 13:31:51.012585 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012597 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012638 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:55.012622225 +0000 UTC m=+94.610554755 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012669 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:55.012663697 +0000 UTC m=+94.610596226 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021755 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021876 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124948 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124967 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124979 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.225488 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.225506 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.225803 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.225941 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227392 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227461 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227492 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.237561 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.249278 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.262888 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.276330 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.289441 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.301050 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330892 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330989 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434735 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434801 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434871 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538274 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538286 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640567 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640640 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742968 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846581 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846600 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846612 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950097 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950150 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950207 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053419 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156465 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.225355 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:52 crc kubenswrapper[4755]: E0320 13:31:52.225518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259252 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361569 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464829 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568165 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568193 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568213 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671481 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671524 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775347 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878927 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878984 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.981955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982046 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982103 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085296 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188473 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188627 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.225478 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.225561 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:53 crc kubenswrapper[4755]: E0320 13:31:53.225706 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:53 crc kubenswrapper[4755]: E0320 13:31:53.225833 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.291930 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.291986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.292000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.292024 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.292040 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.394958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395053 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395078 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395126 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498222 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601431 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704600 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808548 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808632 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808728 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911954 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014550 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014688 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118374 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221266 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221287 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221343 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.225574 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.225773 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.243905 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.244005 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.244268 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324307 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324399 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324455 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428384 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428446 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532289 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532352 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.634711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.634781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.634796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.635143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.635188 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.653043 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.653199 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.740959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741068 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844163 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844197 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950789 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950819 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.955052 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.955140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.955218 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955321 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955353 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955315 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:32:02.955273882 +0000 UTC m=+102.553206411 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955467 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:02.955420576 +0000 UTC m=+102.553353205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955514 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:02.955493638 +0000 UTC m=+102.553426397 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054151 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.056561 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.056637 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056877 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056954 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056881 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056986 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc 
kubenswrapper[4755]: E0320 13:31:55.057017 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.057042 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.057099 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:03.057069239 +0000 UTC m=+102.655001948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.057138 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:03.0571216 +0000 UTC m=+102.655054389 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158200 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.225431 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.225433 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.225598 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.225700 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260571 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260687 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363570 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466952 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570179 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673708 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673766 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776244 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879573 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879680 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879691 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984688 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087172 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087236 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087263 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190775 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.225220 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:56 crc kubenswrapper[4755]: E0320 13:31:56.225440 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.292928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.292984 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.292993 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.293012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.293025 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395016 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395105 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.497940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.497999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.498013 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.498036 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.498051 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.601006 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704185 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807772 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911493 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911530 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014329 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014398 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117576 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220444 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220592 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220621 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.224873 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.224887 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:57 crc kubenswrapper[4755]: E0320 13:31:57.225169 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:57 crc kubenswrapper[4755]: E0320 13:31:57.225487 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324627 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428377 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428492 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531848 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634689 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634772 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634824 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738064 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738149 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738206 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738225 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842361 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842412 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842451 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946571 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050184 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.153995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154107 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.224744 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.224936 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257266 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360698 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360710 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464398 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464441 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567588 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.670982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671064 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671101 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774208 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833584 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833709 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833727 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.846705 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851858 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851977 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.872939 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878611 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878790 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878824 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.896465 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902996 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.921806 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927301 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.943565 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.943787 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946200 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946241 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048945 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151675 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151688 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.225727 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.225758 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:59 crc kubenswrapper[4755]: E0320 13:31:59.225947 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:59 crc kubenswrapper[4755]: E0320 13:31:59.226000 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253772 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253898 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253920 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357578 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357722 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357767 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462885 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462909 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566310 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566451 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669544 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773219 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773273 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773302 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773314 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875965 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.876170 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.979926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980234 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084397 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187551 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187571 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.224838 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:00 crc kubenswrapper[4755]: E0320 13:32:00.225021 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290592 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290702 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290714 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394313 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394356 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497863 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704313 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704365 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808439 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808491 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912395 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912419 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014823 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014892 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014927 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.117952 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118083 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221997 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.225295 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.225349 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:01 crc kubenswrapper[4755]: E0320 13:32:01.225916 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:01 crc kubenswrapper[4755]: E0320 13:32:01.225458 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.246727 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.262382 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.278447 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.294491 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.307562 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.320865 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324714 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324728 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324739 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.337848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428399 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428422 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428437 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532104 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532151 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635579 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635682 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739423 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739456 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.842954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843048 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843076 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843133 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946220 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.048969 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049058 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049080 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049091 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.152998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153080 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153155 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.225456 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.226000 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.228100 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:02 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:02 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:32:02 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:32:02 crc kubenswrapper[4755]: else Mar 20 13:32:02 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:32:02 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:02 crc kubenswrapper[4755]: fi Mar 20 13:32:02 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:32:02 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:02 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.228290 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.228618 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:02 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:32:02 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:32:02 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:32:02 crc kubenswrapper[4755]: set +o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: fi Mar 20 13:32:02 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:32:02 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:32:02 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:32:02 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:32:02 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:32:02 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:32:02 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:32:02 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:32:02 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:32:02 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:32:02 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:32:02 crc kubenswrapper[4755]: --disable-approver \ Mar 
20 13:32:02 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:32:02 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:32:02 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},
Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:02 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.229788 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.229840 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.230822 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:02 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:32:02 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:32:02 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:32:02 crc kubenswrapper[4755]: set +o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: fi Mar 20 13:32:02 crc kubenswrapper[4755]: Mar 20 
13:32:02 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 13:32:02 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:32:02 crc kubenswrapper[4755]: --disable-webhook \ Mar 20 13:32:02 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:32:02 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:02 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.232048 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.257945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258105 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361776 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361821 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464896 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464942 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568202 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672487 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776181 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879114 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879239 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.982940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983019 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983088 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.039946 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.040092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.040180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040260 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.040224509 +0000 UTC m=+118.638157048 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040305 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040313 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040394 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.040372133 +0000 UTC m=+118.638304692 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040429 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:32:19.040407954 +0000 UTC m=+118.638340523 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086569 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086642 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.141560 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.141751 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.141953 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142000 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142018 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142037 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142082 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142040 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142160 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.14213409 +0000 UTC m=+118.740066649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142305 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.142264583 +0000 UTC m=+118.740197162 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.193348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194289 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194312 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.224918 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.225038 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.225122 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.225260 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297792 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297943 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297964 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401539 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504568 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504587 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504638 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.607999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608136 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711282 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711295 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.814848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815024 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815049 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815104 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918300 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918404 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021283 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021416 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125264 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125337 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125356 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.225078 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:04 crc kubenswrapper[4755]: E0320 13:32:04.225331 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227869 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330597 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330714 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330810 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434206 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434255 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537691 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641515 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641529 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744402 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848108 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848242 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951858 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054828 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054846 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158618 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.225599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.225778 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:05 crc kubenswrapper[4755]: E0320 13:32:05.225985 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.226078 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:32:05 crc kubenswrapper[4755]: E0320 13:32:05.226184 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262367 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262478 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364869 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364924 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364969 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468149 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468473 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468485 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468515 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.575840 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576845 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680619 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680751 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.685619 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.688162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.688634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.708810 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.723077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.739499 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.762742 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, 
/tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.780472 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783883 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.784011 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.798324 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.813011 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887579 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887617 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887642 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991186 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.007154 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zf67p"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.007864 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.010513 4755 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.010569 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.010607 4755 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.010733 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.013265 4755 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.013329 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.030273 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.040292 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.053997 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.067052 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.093994 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094156 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.114253 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.139789 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.159476 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.171441 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/378696d3-72aa-4101-9746-a2b0d203f525-hosts-file\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.171472 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.172362 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197218 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197230 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.225258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.225455 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.272357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/378696d3-72aa-4101-9746-a2b0d203f525-hosts-file\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.272430 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.272579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/378696d3-72aa-4101-9746-a2b0d203f525-hosts-file\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299423 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299469 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299490 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.399185 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dmzsb"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.400300 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402301 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402605 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8btvn"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403003 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403075 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xmn6s"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403324 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403570 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.405118 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.405252 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.406067 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.406283 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410141 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410693 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410791 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410976 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.412162 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.412504 4755 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.412502 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.423258 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.439578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.444827 4755 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.455831 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.465792 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.479215 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.491345 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506363 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506385 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506399 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.507421 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.523465 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.538607 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.552317 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.565030 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-os-release\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577083 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9w8l\" (UniqueName: \"kubernetes.io/projected/e5ba4f17-8c41-4124-b563-01d5f1751139-kube-api-access-p9w8l\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577142 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-os-release\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577199 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-multus\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577328 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-kubelet\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577822 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-conf-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577862 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc 
kubenswrapper[4755]: I0320 13:32:06.577910 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-system-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577936 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-bin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578102 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-k8s-cni-cncf-io\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578167 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rknk9\" (UniqueName: \"kubernetes.io/projected/3eb406f6-1a26-4eea-84ac-e55f5232900c-kube-api-access-rknk9\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578327 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " 
pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578449 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb406f6-1a26-4eea-84ac-e55f5232900c-mcd-auth-proxy-config\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-cnibin\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578606 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-etc-kubernetes\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578724 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3eb406f6-1a26-4eea-84ac-e55f5232900c-rootfs\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578785 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578945 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-netns\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579126 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-system-cni-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-cnibin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579270 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3eb406f6-1a26-4eea-84ac-e55f5232900c-proxy-tls\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579319 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctxm\" (UniqueName: \"kubernetes.io/projected/aa13631f-58da-4411-8e94-2385741a977e-kube-api-access-lctxm\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579434 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-cni-binary-copy\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-daemon-config\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579607 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-socket-dir-parent\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-hostroot\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-multus-certs\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.581333 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.597257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609932 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.614062 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.627708 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.642891 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.665785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680792 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-system-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680844 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-bin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-k8s-cni-cncf-io\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rknk9\" (UniqueName: 
\"kubernetes.io/projected/3eb406f6-1a26-4eea-84ac-e55f5232900c-kube-api-access-rknk9\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb406f6-1a26-4eea-84ac-e55f5232900c-mcd-auth-proxy-config\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-cnibin\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-etc-kubernetes\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681100 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/3eb406f6-1a26-4eea-84ac-e55f5232900c-rootfs\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-k8s-cni-cncf-io\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681202 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681221 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-cnibin\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-netns\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-system-cni-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681395 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-etc-kubernetes\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-cnibin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681442 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3eb406f6-1a26-4eea-84ac-e55f5232900c-rootfs\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " 
pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681455 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3eb406f6-1a26-4eea-84ac-e55f5232900c-proxy-tls\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lctxm\" (UniqueName: \"kubernetes.io/projected/aa13631f-58da-4411-8e94-2385741a977e-kube-api-access-lctxm\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-cni-binary-copy\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681634 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-daemon-config\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681717 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-socket-dir-parent\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " 
pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681765 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-hostroot\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-multus-certs\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681863 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-os-release\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681905 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9w8l\" (UniqueName: \"kubernetes.io/projected/e5ba4f17-8c41-4124-b563-01d5f1751139-kube-api-access-p9w8l\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-os-release\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682020 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-kubelet\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-multus\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-conf-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-conf-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-netns\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682422 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-bin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-system-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-os-release\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682552 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: 
\"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-kubelet\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb406f6-1a26-4eea-84ac-e55f5232900c-mcd-auth-proxy-config\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682839 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-os-release\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682906 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-multus\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-system-cni-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " 
pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-cnibin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683144 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-cni-binary-copy\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-hostroot\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-multus-certs\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-daemon-config\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683929 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-socket-dir-parent\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.686694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.693097 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3eb406f6-1a26-4eea-84ac-e55f5232900c-proxy-tls\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.702053 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.709684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9w8l\" (UniqueName: \"kubernetes.io/projected/e5ba4f17-8c41-4124-b563-01d5f1751139-kube-api-access-p9w8l\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715102 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715535 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.717313 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lctxm\" (UniqueName: \"kubernetes.io/projected/aa13631f-58da-4411-8e94-2385741a977e-kube-api-access-lctxm\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.718139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rknk9\" (UniqueName: \"kubernetes.io/projected/3eb406f6-1a26-4eea-84ac-e55f5232900c-kube-api-access-rknk9\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.727339 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.739206 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.742725 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa13631f_58da_4411_8e94_2385741a977e.slice/crio-bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f WatchSource:0}: Error finding container bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f: Status 404 returned error can't find the container with id bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.746791 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lctxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Li
venessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dmzsb_openshift-multus(aa13631f-58da-4411-8e94-2385741a977e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.748075 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" podUID="aa13631f-58da-4411-8e94-2385741a977e" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.750018 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ba4f17_8c41_4124_b563_01d5f1751139.slice/crio-3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398 WatchSource:0}: Error finding container 3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398: Status 404 returned error can't find the container with id 3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398 Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.750298 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.753808 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:06 crc kubenswrapper[4755]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 13:32:06 crc kubenswrapper[4755]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 13:32:06 crc kubenswrapper[4755]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9w8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:06 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.755000 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.771339 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.774771 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.776072 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.792706 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.793919 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.799035 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800674 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800722 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800798 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800959 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.802104 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.804338 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.813861 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818865 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.829580 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.844238 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.860886 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.877047 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884468 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884521 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884569 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884714 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884752 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884801 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884827 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: 
I0320 13:32:06.884854 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884891 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884938 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884960 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.885022 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.885045 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.891909 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.906778 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921783 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.924273 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.942559 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.960334 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.984313 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986549 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986684 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986708 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc 
kubenswrapper[4755]: I0320 13:32:06.986862 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986902 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986916 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986942 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987006 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987073 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987109 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987284 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987313 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987371 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987924 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987945 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987996 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"ovnkube-node-bd25w\" (UID: 
\"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988165 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988137 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988209 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.989004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.989082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.993119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.995892 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.006119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025511 4755 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025573 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025597 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.118732 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129123 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129169 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.137364 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:07 crc kubenswrapper[4755]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 13:32:07 crc kubenswrapper[4755]: apiVersion: v1 Mar 20 13:32:07 crc kubenswrapper[4755]: clusters: Mar 20 13:32:07 crc kubenswrapper[4755]: - cluster: Mar 20 13:32:07 crc kubenswrapper[4755]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 13:32:07 crc kubenswrapper[4755]: server: https://api-int.crc.testing:6443 Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: contexts: Mar 20 13:32:07 crc kubenswrapper[4755]: - context: Mar 20 13:32:07 crc kubenswrapper[4755]: cluster: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: namespace: default Mar 20 13:32:07 crc kubenswrapper[4755]: user: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: current-context: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: kind: Config Mar 20 13:32:07 crc kubenswrapper[4755]: preferences: {} Mar 20 13:32:07 crc kubenswrapper[4755]: users: Mar 20 13:32:07 crc kubenswrapper[4755]: - name: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: user: Mar 20 13:32:07 crc kubenswrapper[4755]: client-certificate: 
/etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: EOF Mar 20 13:32:07 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlq8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:07 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.139489 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.225743 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.226074 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.226303 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.226549 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231537 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231552 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.244350 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.287249 4755 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.326236 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334595 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334778 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.425600 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.425600 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.428037 4755 projected.go:194] Error preparing data for projected volume kube-api-access-h5qsm for pod openshift-dns/node-resolver-zf67p: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.428170 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm podName:378696d3-72aa-4101-9746-a2b0d203f525 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:07.928137109 +0000 UTC m=+107.526069668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h5qsm" (UniqueName: "kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm") pod "node-resolver-zf67p" (UID: "378696d3-72aa-4101-9746-a2b0d203f525") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438101 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.540915 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.540978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.540997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.541048 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.541067 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644190 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644229 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.701962 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.704153 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lctxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fal
lbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dmzsb_openshift-multus(aa13631f-58da-4411-8e94-2385741a977e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.704465 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"c5dfe5e0ba9e4e073084c039346a869cdace2560fac63c02b23de7cad0ed5e4a"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.706326 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" podUID="aa13631f-58da-4411-8e94-2385741a977e" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.706366 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:07 crc kubenswrapper[4755]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 13:32:07 crc kubenswrapper[4755]: apiVersion: v1 Mar 20 13:32:07 crc kubenswrapper[4755]: clusters: Mar 20 13:32:07 crc kubenswrapper[4755]: - cluster: Mar 20 13:32:07 crc kubenswrapper[4755]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 13:32:07 crc kubenswrapper[4755]: server: https://api-int.crc.testing:6443 Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: contexts: Mar 20 13:32:07 crc kubenswrapper[4755]: - context: Mar 
20 13:32:07 crc kubenswrapper[4755]: cluster: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: namespace: default Mar 20 13:32:07 crc kubenswrapper[4755]: user: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: current-context: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: kind: Config Mar 20 13:32:07 crc kubenswrapper[4755]: preferences: {} Mar 20 13:32:07 crc kubenswrapper[4755]: users: Mar 20 13:32:07 crc kubenswrapper[4755]: - name: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: user: Mar 20 13:32:07 crc kubenswrapper[4755]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: EOF Mar 20 13:32:07 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlq8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:07 crc 
kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.707101 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"a7d0785ead0f22a1ed3bc834bf588ea6bac1d78135d33174225637d2a0f4afac"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.707677 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.708564 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.709090 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.711944 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.712124 4755 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:07 crc kubenswrapper[4755]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 13:32:07 crc kubenswrapper[4755]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 13:32:07 crc kubenswrapper[4755]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9w8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:07 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.714191 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.714243 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.716365 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.736537 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746847 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.747741 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.757909 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.768360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.778406 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.790960 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.817301 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.833454 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.847460 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849561 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849582 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849598 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.863017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.875746 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.891810 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.908023 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.921957 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.935807 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953513 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.955314 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.973954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.986550 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.997204 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.999055 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.004793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 
13:32:08.011188 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.024787 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.040297 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.056929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057259 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057277 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057625 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.085764 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.100432 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.128739 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.156560 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:08 crc kubenswrapper[4755]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:08 crc kubenswrapper[4755]: set -uo pipefail Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 13:32:08 crc kubenswrapper[4755]: HOSTS_FILE="/etc/hosts" Mar 20 13:32:08 crc kubenswrapper[4755]: TEMP_FILE="/etc/hosts.tmp" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: IFS=', ' read -r -a services <<< 
"${SERVICES}" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Make a temporary file with the old hosts file's attributes. Mar 20 13:32:08 crc kubenswrapper[4755]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: echo "Failed to preserve hosts file. Exiting." Mar 20 13:32:08 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: while true; do Mar 20 13:32:08 crc kubenswrapper[4755]: declare -A svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${services[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 13:32:08 crc kubenswrapper[4755]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 13:32:08 crc kubenswrapper[4755]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 13:32:08 crc kubenswrapper[4755]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 13:32:08 crc kubenswrapper[4755]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 13:32:08 crc kubenswrapper[4755]: for i in ${!cmds[*]} Mar 20 13:32:08 crc kubenswrapper[4755]: do Mar 20 13:32:08 crc kubenswrapper[4755]: ips=($(eval "${cmds[i]}")) Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: svc_ips["${svc}"]="${ips[@]}" Mar 20 13:32:08 crc kubenswrapper[4755]: break Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Update /etc/hosts only if we get valid service IPs Mar 20 13:32:08 crc kubenswrapper[4755]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 13:32:08 crc kubenswrapper[4755]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 13:32:08 crc kubenswrapper[4755]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Append resolver entries for services Mar 20 13:32:08 crc kubenswrapper[4755]: rc=0 Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${!svc_ips[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: for ip in ${svc_ips[${svc}]}; do Mar 20 13:32:08 crc kubenswrapper[4755]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ $rc -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 13:32:08 crc kubenswrapper[4755]: # Replace /etc/hosts with our modified version if needed Mar 20 13:32:08 crc kubenswrapper[4755]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 13:32:08 crc kubenswrapper[4755]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: unset svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5qsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zf67p_openshift-dns(378696d3-72aa-4101-9746-a2b0d203f525): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:08 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.158127 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zf67p" podUID="378696d3-72aa-4101-9746-a2b0d203f525" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.159990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.160042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc 
kubenswrapper[4755]: I0320 13:32:08.160062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.160089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.160124 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.224978 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.225198 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263960 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366767 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471130 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471281 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574736 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574814 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574837 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574850 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678611 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678821 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.714390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zf67p" event={"ID":"378696d3-72aa-4101-9746-a2b0d203f525","Type":"ContainerStarted","Data":"a6a1240d78c23b729861be2ba99389d3cedd2920dff26b255d3ba2e8d5584aba"} Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.717038 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:08 crc kubenswrapper[4755]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:08 crc kubenswrapper[4755]: set -uo pipefail Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 13:32:08 crc kubenswrapper[4755]: HOSTS_FILE="/etc/hosts" Mar 20 13:32:08 crc kubenswrapper[4755]: TEMP_FILE="/etc/hosts.tmp" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Make a temporary file with the old hosts file's attributes. Mar 20 13:32:08 crc kubenswrapper[4755]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: echo "Failed to preserve hosts file. Exiting." 
Mar 20 13:32:08 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: while true; do Mar 20 13:32:08 crc kubenswrapper[4755]: declare -A svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${services[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 13:32:08 crc kubenswrapper[4755]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 13:32:08 crc kubenswrapper[4755]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 13:32:08 crc kubenswrapper[4755]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 13:32:08 crc kubenswrapper[4755]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 13:32:08 crc kubenswrapper[4755]: for i in ${!cmds[*]} Mar 20 13:32:08 crc kubenswrapper[4755]: do Mar 20 13:32:08 crc kubenswrapper[4755]: ips=($(eval "${cmds[i]}")) Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: svc_ips["${svc}"]="${ips[@]}" Mar 20 13:32:08 crc kubenswrapper[4755]: break Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Update /etc/hosts only if we get valid service IPs Mar 20 13:32:08 crc kubenswrapper[4755]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 13:32:08 crc kubenswrapper[4755]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 13:32:08 crc kubenswrapper[4755]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Append resolver entries for services Mar 20 13:32:08 crc kubenswrapper[4755]: rc=0 Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${!svc_ips[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: for ip in ${svc_ips[${svc}]}; do Mar 20 13:32:08 crc kubenswrapper[4755]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ $rc -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 13:32:08 crc kubenswrapper[4755]: # Replace /etc/hosts with our modified version if needed Mar 20 13:32:08 crc kubenswrapper[4755]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 13:32:08 crc kubenswrapper[4755]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: unset svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5qsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zf67p_openshift-dns(378696d3-72aa-4101-9746-a2b0d203f525): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:08 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.718451 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zf67p" podUID="378696d3-72aa-4101-9746-a2b0d203f525" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.729089 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.753269 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.764417 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.773640 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 
13:32:08.782358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782525 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.785542 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.796957 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.810491 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.825829 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.842945 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.856910 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.874614 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885492 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.892444 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.903259 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988907 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988958 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988980 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025496 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.037698 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042385 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.056522 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061774 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061825 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.078811 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083629 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083679 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083694 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.099291 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104354 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104705 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.119179 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.119611 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.121920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.121970 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.121989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.122015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.122035 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.224552 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.224617 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.224730 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.224904 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225579 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.328362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.328867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.329012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.329188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.329341 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.431640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.431991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.432078 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.432164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.432253 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536241 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639259 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.742291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.742725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.742935 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.743090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.743230 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846972 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950406 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.053684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054261 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.158529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159525 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.224911 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:10 crc kubenswrapper[4755]: E0320 13:32:10.225126 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262835 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262900 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262970 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365905 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.366016 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469902 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469982 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574382 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574402 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678511 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.782110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.782557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.782864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.783380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.783609 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.887862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888976 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992883 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.095909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.095982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.096000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.096028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.096048 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199862 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.225572 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.225881 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:11 crc kubenswrapper[4755]: E0320 13:32:11.226021 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:11 crc kubenswrapper[4755]: E0320 13:32:11.225851 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.246013 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.274129 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.293067 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303561 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303633 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303704 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.312274 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.327306 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.343578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.353925 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.380214 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.392073 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407133 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407236 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.409540 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.423327 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.435795 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.449234 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510513 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510533 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613021 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613154 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717266 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717347 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820695 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820708 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923773 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923812 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027575 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027586 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131463 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131565 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.225257 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:12 crc kubenswrapper[4755]: E0320 13:32:12.225510 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234336 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234382 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337122 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337241 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441097 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441234 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544484 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647556 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.686921 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-b9bt2"] Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.687490 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.691006 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.691890 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.692062 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.691906 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.724588 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}}
,{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.739786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751226 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751332 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.758262 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.771202 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.781914 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.797864 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.812389 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.828947 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.849394 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.852451 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/741d8a76-423b-4e13-aedb-fff0e87a207c-serviceca\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.852552 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4m4w\" (UniqueName: 
\"kubernetes.io/projected/741d8a76-423b-4e13-aedb-fff0e87a207c-kube-api-access-w4m4w\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.852725 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/741d8a76-423b-4e13-aedb-fff0e87a207c-host\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855479 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855548 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.880991 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.902064 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.922317 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.938744 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.952848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.953529 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/741d8a76-423b-4e13-aedb-fff0e87a207c-serviceca\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.953586 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4m4w\" (UniqueName: \"kubernetes.io/projected/741d8a76-423b-4e13-aedb-fff0e87a207c-kube-api-access-w4m4w\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.953700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/741d8a76-423b-4e13-aedb-fff0e87a207c-host\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 
13:32:12.953944 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/741d8a76-423b-4e13-aedb-fff0e87a207c-host\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.957836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/741d8a76-423b-4e13-aedb-fff0e87a207c-serviceca\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959539 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959712 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959741 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.985187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4m4w\" (UniqueName: \"kubernetes.io/projected/741d8a76-423b-4e13-aedb-fff0e87a207c-kube-api-access-w4m4w\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.010761 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.032932 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:13 crc kubenswrapper[4755]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 13:32:13 crc kubenswrapper[4755]: while [ true ]; Mar 20 13:32:13 crc kubenswrapper[4755]: do Mar 20 13:32:13 crc kubenswrapper[4755]: for f in $(ls /tmp/serviceca); do Mar 20 13:32:13 crc kubenswrapper[4755]: echo $f Mar 20 13:32:13 crc kubenswrapper[4755]: ca_file_path="/tmp/serviceca/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ -e "${reg_dir_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: else Mar 20 13:32:13 crc kubenswrapper[4755]: mkdir $reg_dir_path Mar 20 13:32:13 crc kubenswrapper[4755]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: for d in $(ls /etc/docker/certs.d); do Mar 20 
13:32:13 crc kubenswrapper[4755]: echo $d Mar 20 13:32:13 crc kubenswrapper[4755]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: rm -rf /etc/docker/certs.d/$d Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: sleep 60 & wait ${!} Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4m4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
node-ca-b9bt2_openshift-image-registry(741d8a76-423b-4e13-aedb-fff0e87a207c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:13 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.034629 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-b9bt2" podUID="741d8a76-423b-4e13-aedb-fff0e87a207c" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.062951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063045 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063112 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166290 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166338 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.224917 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.225049 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.225144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.225267 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269948 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373331 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476758 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579622 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682559 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.733953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9bt2" event={"ID":"741d8a76-423b-4e13-aedb-fff0e87a207c","Type":"ContainerStarted","Data":"8b589ffb4c4049b586dc6be51e4e22a1e3a85b7d86d156bc885232694332d18a"} Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.737309 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:13 crc kubenswrapper[4755]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 13:32:13 crc kubenswrapper[4755]: while [ true ]; Mar 20 13:32:13 crc kubenswrapper[4755]: do Mar 20 13:32:13 crc kubenswrapper[4755]: for f in $(ls /tmp/serviceca); do Mar 20 13:32:13 crc kubenswrapper[4755]: echo $f Mar 20 13:32:13 crc kubenswrapper[4755]: ca_file_path="/tmp/serviceca/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ -e "${reg_dir_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: else Mar 20 13:32:13 crc kubenswrapper[4755]: mkdir $reg_dir_path Mar 20 13:32:13 crc kubenswrapper[4755]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: for d in $(ls /etc/docker/certs.d); do Mar 20 13:32:13 crc kubenswrapper[4755]: echo $d Mar 20 13:32:13 crc kubenswrapper[4755]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: rm -rf /etc/docker/certs.d/$d Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: sleep 60 & wait ${!} Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4m4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-b9bt2_openshift-image-registry(741d8a76-423b-4e13-aedb-fff0e87a207c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:13 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.738863 4755 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-b9bt2" podUID="741d8a76-423b-4e13-aedb-fff0e87a207c" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.745472 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.756217 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.773644 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.785877 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787673 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.798965 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.818707 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.837004 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.852911 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.872218 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.885004 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.890983 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891510 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.899220 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.918801 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.938999 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.953367 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.996982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997136 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997434 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100382 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203575 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203585 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203622 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.224874 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:14 crc kubenswrapper[4755]: E0320 13:32:14.225043 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307286 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307431 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307443 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410919 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.411002 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514810 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618757 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722560 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722617 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722702 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826385 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826449 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826524 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929993 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033381 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.135997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136076 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136100 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136117 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.225515 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.225625 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:15 crc kubenswrapper[4755]: E0320 13:32:15.225747 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:15 crc kubenswrapper[4755]: E0320 13:32:15.225835 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239745 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342851 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342919 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342996 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447561 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447630 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551228 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551275 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654893 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654944 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759400 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965436 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965564 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174150 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174170 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.224762 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:16 crc kubenswrapper[4755]: E0320 13:32:16.225114 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:16 crc kubenswrapper[4755]: E0320 13:32:16.227793 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:16 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:16 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:16 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:32:16 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:32:16 crc kubenswrapper[4755]: else Mar 20 13:32:16 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:32:16 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:16 crc kubenswrapper[4755]: fi Mar 20 13:32:16 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:32:16 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:16 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:16 crc kubenswrapper[4755]: E0320 13:32:16.229491 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 
13:32:16.277308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277346 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485328 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485346 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588481 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691514 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795935 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795969 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898745 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898796 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002977 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105905 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105924 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.208886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.208959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.208976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.209004 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.209020 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.225349 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.225374 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.225550 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.226022 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.227256 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.227771 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:17 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:32:17 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:32:17 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:17 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:32:17 crc kubenswrapper[4755]: set +o allexport Mar 20 13:32:17 crc 
kubenswrapper[4755]: fi Mar 20 13:32:17 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:32:17 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:32:17 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:32:17 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:32:17 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:32:17 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:32:17 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:32:17 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:32:17 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:32:17 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:32:17 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:32:17 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:32:17 crc kubenswrapper[4755]: --disable-approver \ Mar 20 13:32:17 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:32:17 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:32:17 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:32:17 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:32:17 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:17 crc 
kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.228367 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.229803 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:17 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:32:17 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:32:17 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:17 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:32:17 crc kubenswrapper[4755]: set +o allexport Mar 20 13:32:17 crc kubenswrapper[4755]: fi Mar 20 13:32:17 crc kubenswrapper[4755]: Mar 20 13:32:17 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 13:32:17 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:32:17 crc kubenswrapper[4755]: --disable-webhook \ Mar 20 13:32:17 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 13:32:17 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:32:17 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:17 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.231051 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312387 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312745 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312955 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.415941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416858 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520154 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520273 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520289 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.623701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624890 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728567 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728593 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.831889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.831953 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.831971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.832006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.832028 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936518 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936530 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.041001 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.146154 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.224907 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:18 crc kubenswrapper[4755]: E0320 13:32:18.225135 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249374 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249433 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249498 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352414 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352431 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456208 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.548104 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn"] Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.548869 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.551508 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.553315 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.559017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.568272 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.583302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.597885 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.616386 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.619999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcct2\" (UniqueName: \"kubernetes.io/projected/18256fa3-a343-4dc3-8c00-f6f5de000b4b-kube-api-access-dcct2\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.620077 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.620192 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.620248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.627891 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.640379 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.651855 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.661947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662066 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662080 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.666132 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.682560 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.696405 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.713941 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721816 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721890 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721937 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721981 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcct2\" (UniqueName: \"kubernetes.io/projected/18256fa3-a343-4dc3-8c00-f6f5de000b4b-kube-api-access-dcct2\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.723011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.723329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc 
kubenswrapper[4755]: I0320 13:32:18.731527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.742181 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcct2\" (UniqueName: \"kubernetes.io/projected/18256fa3-a343-4dc3-8c00-f6f5de000b4b-kube-api-access-dcct2\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.743252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.753041 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.758510 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.772089 4755 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774236 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774337 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774379 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.785452 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.798848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.813607 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.826610 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.840010 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.853439 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.862370 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.865456 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.874588 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878157 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878275 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878292 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.897884 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.912502 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.930211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.941112 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.951919 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.971186 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981329 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981555 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981565 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.007366 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.018526 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084746 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084755 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084771 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084781 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.125714 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.125799 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.125836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.125980 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126037 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.126020508 +0000 UTC m=+150.723953057 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126107 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.12609583 +0000 UTC m=+150.724028359 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126196 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126290 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.126269245 +0000 UTC m=+150.724201774 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188849 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188900 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.225074 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.225127 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.225307 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226152 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.226308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.226394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226495 4755 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226535 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226556 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226595 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226624 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226634 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.226607032 +0000 UTC m=+150.824539731 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226645 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226790 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.226766027 +0000 UTC m=+150.824698596 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.279555 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kpm42"] Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.282093 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.282170 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295280 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295349 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295362 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.307336 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.319235 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.328014 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.328265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qzh\" (UniqueName: \"kubernetes.io/projected/37d1e037-c169-4932-9928-f3d23ff47c07-kube-api-access-24qzh\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.336979 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.351191 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.363407 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.380696 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.389762 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398031 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398112 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398126 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.402991 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.422755 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.430555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.430642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24qzh\" (UniqueName: \"kubernetes.io/projected/37d1e037-c169-4932-9928-f3d23ff47c07-kube-api-access-24qzh\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.430742 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.430813 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:32:19.930790993 +0000 UTC m=+119.528723522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.441214 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450216 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qzh\" (UniqueName: \"kubernetes.io/projected/37d1e037-c169-4932-9928-f3d23ff47c07-kube-api-access-24qzh\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450698 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450713 4755 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450755 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.455130 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.464693 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.466070 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472460 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472515 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472547 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.481203 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.481510 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485863 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485959 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.495850 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500361 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.507246 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.516368 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.518011 4755 kubelet_node_status.go:585] "Error updating node status, 
will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522877 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522935 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522948 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.531960 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.534027 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.534182 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544739 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544812 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648284 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751203 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751217 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.753558 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" event={"ID":"18256fa3-a343-4dc3-8c00-f6f5de000b4b","Type":"ContainerStarted","Data":"cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.753595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" event={"ID":"18256fa3-a343-4dc3-8c00-f6f5de000b4b","Type":"ContainerStarted","Data":"fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.753607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" event={"ID":"18256fa3-a343-4dc3-8c00-f6f5de000b4b","Type":"ContainerStarted","Data":"9197c20e8553f6422e715ce60f4db4d61ef6e1e55385abe17b9eb996f3107c3b"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.756580 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.756981 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.757900 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" 
event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.772265 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.792499 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.803048 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.818442 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.832037 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.844698 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.854150 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.854928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.854989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.855008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.855068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.855088 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.864054 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.880741 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.890700 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.907360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.918460 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.931446 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.938261 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.938469 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.938548 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:20.938530219 +0000 UTC m=+120.536462768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.943383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957474 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957571 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.958277 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.972758 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.983483 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.994735 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.007645 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.020395 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.030780 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.039295 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.046748 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.057041 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.060917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.060992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.061012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.061493 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.061556 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.070775 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.083107 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.098852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.111930 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.125564 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.134211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.150767 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.159625 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164390 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.225432 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:20 crc kubenswrapper[4755]: E0320 13:32:20.225624 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267011 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267098 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371246 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371320 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371378 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371401 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578582 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578602 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683910 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683934 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.764771 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca" exitCode=0 Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.764886 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.777758 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787772 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.790239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\
\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.805512 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.815081 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.824952 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.840842 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.862000 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.874938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.886005 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891573 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891646 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.905916 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.917873 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.930139 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.946687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.950704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:20 crc kubenswrapper[4755]: E0320 13:32:20.950908 4755 
secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:20 crc kubenswrapper[4755]: E0320 13:32:20.951005 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:22.950979824 +0000 UTC m=+122.548912433 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.958383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.982419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.996576 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999329 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999442 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103270 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:21Z","lastTransitionTime":"2026-03-20T13:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.203778 4755 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.225037 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.225108 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.225202 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.225350 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.225722 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.225959 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.239535 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.249483 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.279915 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.293432 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.305357 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.321423 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.332845 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.339985 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.351158 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.370873 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.388371 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.403049 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.417571 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.432545 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.462586 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.478525 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.497021 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.772349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545"} Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.779011 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130" exitCode=0 Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.779083 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130"} Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.798529 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.823958 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.857016 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.871786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.889756 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.906963 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.923527 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.933309 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.964084 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.973947 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.987165 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.004291 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.012107 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.021080 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.030484 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.039122 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.047581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.068415 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.081912 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.094474 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.106588 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.120548 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.131932 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.142579 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.151454 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.164999 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\"
,\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.178725 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.194113 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.233241 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:22 crc kubenswrapper[4755]: E0320 13:32:22.233797 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.234406 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.255733 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.292752 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.325933 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.358856 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.786877 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c" exitCode=0 Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.786961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c"} Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.791564 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" exitCode=0 Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.791705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.823834 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.841882 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.858486 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.871303 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.883173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.896323 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.910081 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.924780 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.936954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.950834 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.969860 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.973407 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:22 crc kubenswrapper[4755]: E0320 13:32:22.973626 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:22 crc kubenswrapper[4755]: E0320 13:32:22.973768 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:26.973743215 +0000 UTC m=+126.571675834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.987170 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.995504 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.007784 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.021295 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.033242 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.043456 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.083687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.119048 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.157684 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.198341 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.224947 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.225041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.224966 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:23 crc kubenswrapper[4755]: E0320 13:32:23.225112 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:23 crc kubenswrapper[4755]: E0320 13:32:23.225211 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:23 crc kubenswrapper[4755]: E0320 13:32:23.225322 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.240972 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.279920 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.315761 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.360740 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.400564 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.443532 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.477287 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.519365 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.560470 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.598017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.643837 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.680447 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.718115 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.802876 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zf67p" event={"ID":"378696d3-72aa-4101-9746-a2b0d203f525","Type":"ContainerStarted","Data":"4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" 
event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808348 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808369 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808391 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.812875 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869" 
exitCode=0 Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.812947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.823752 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.839510 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.859136 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.878153 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.926306 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.961949 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.003900 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.038959 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.082768 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.118770 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.157512 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.208403 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.225308 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:24 crc kubenswrapper[4755]: E0320 13:32:24.225452 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.238914 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.282915 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.319315 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.363520 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.401980 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.442944 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.481211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.526286 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.576821 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.600250 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.643242 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.682838 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.717945 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.775486 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.801799 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.824225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122"} Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.842410 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.878813 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.922596 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.960182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.003252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.045173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.081678 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.117152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.177152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.201950 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.225244 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.225340 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:25 crc kubenswrapper[4755]: E0320 13:32:25.225448 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:25 crc kubenswrapper[4755]: E0320 13:32:25.225589 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.225712 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:25 crc kubenswrapper[4755]: E0320 13:32:25.225955 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.238121 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release
\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.277777 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.317418 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.360554 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.404984 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.441568 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.482989 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.530557 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"
name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067461
6e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.573757 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.603223 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.642894 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.685514 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.720384 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.760607 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.838787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.843023 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122" exitCode=0 Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.843084 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122"} Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.868225 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129
aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.878847 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.890800 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-co
py\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.918377 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.959163 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.997508 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.038937 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.078938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.119315 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.159366 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.211817 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.225040 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:26 crc kubenswrapper[4755]: E0320 13:32:26.225306 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.239517 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.284821 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.319239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: E0320 13:32:26.334290 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.360983 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.397970 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.446187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.856335 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3" exitCode=0 Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.856519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3"} Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.860142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9bt2" event={"ID":"741d8a76-423b-4e13-aedb-fff0e87a207c","Type":"ContainerStarted","Data":"3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a"} Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.872344 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.891295 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.908958 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.924693 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.943285 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.964622 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a
930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.989479 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.001404 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.020520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.020786 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.020939 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 
nodeName:}" failed. No retries permitted until 2026-03-20 13:32:35.020907794 +0000 UTC m=+134.618840353 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.034319 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7
e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.056599 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.068216 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.080136 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.095262 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.109413 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.121510 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.133752 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.141593 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.162953 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.199885 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.225080 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.225231 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.225398 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.225443 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.225633 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.225732 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.241557 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.280597 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.318616 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.359603 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.401052 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.442162 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.477928 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.521971 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.564550 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.611062 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.639259 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.683870 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.723465 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.762531 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.802851 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.872387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf"} Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.888475 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.918954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.934517 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.961137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.003966 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.045052 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.079425 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.122307 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.162531 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.208750 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.225258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:28 crc kubenswrapper[4755]: E0320 13:32:28.225434 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.246211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.281778 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.320790 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.370850 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.399361 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.444784 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.484158 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.886568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95"} Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.887152 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.887226 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.910494 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.925280 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.940280 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.955701 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.968098 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.988257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.003572 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.022311 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.039306 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.068623 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.078948 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.091383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.099611 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.109274 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.121619 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.135946 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.147841 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.162913 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.196498 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.225360 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.225726 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.226019 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.226225 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.226372 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.226525 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.240881 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.282342 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.319496 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.360382 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.398516 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.447384 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.477354 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.530106 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.560679 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564564 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.579422 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585161 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.599142 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.599491 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.604672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.604713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.604724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.605059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.605099 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.619273 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624265 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624357 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.641842 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656559 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656617 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.669234 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad1619268906079
28dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.680830 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.681080 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.698285 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.717797 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.760558 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.798449 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.836355 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.893113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1"} Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.894901 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.925176 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.948302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.959201 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.970353 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.998709 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.041069 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.078930 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.121372 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.159711 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"n
ame\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.216931 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc
2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9
bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.225790 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:30 crc kubenswrapper[4755]: E0320 13:32:30.226550 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.237483 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\
\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.278502 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.320151 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.363774 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.403149 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.441637 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.480601 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.521176 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.899479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81"} Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.899577 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb"} Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.925221 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.944208 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:30 crc 
kubenswrapper[4755]: I0320 13:32:30.966377 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.988353 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.005675 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.024604 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.048365 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.063453 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.091604 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.105159 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.120358 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.134982 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.148702 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.162590 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.175140 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.190848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.207469 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.224982 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.224993 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.225032 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.225648 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.225812 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.226055 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.245781 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.286738 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.323578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.334933 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.361728 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.401077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.439976 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.479890 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.538340 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.566950 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.607793 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.640318 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.680120 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.721386 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.773173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.802397 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc 
kubenswrapper[4755]: I0320 13:32:31.845470 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.879595 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.926938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.966831 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.997892 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc 
kubenswrapper[4755]: I0320 13:32:32.042963 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.080600 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.128142 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.160594 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.199124 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.224750 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:32 crc kubenswrapper[4755]: E0320 13:32:32.224891 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.245962 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-2
0T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.283312 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.322464 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.361953 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.401241 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.441864 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.483699 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.520871 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.564984 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.911162 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/0.log" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.959366 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95" exitCode=1 Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.959457 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95"} Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.960493 4755 scope.go:117] "RemoveContainer" 
containerID="52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.995479 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sh
a256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.009610 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.021247 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.034227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.051633 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.071077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317
fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.083921 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.099519 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.132243 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.148152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.167262 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c2
5781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.184570 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.200740 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.221467 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.224772 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.224910 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:33 crc kubenswrapper[4755]: E0320 13:32:33.224972 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:33 crc kubenswrapper[4755]: E0320 13:32:33.225167 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.225399 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:33 crc kubenswrapper[4755]: E0320 13:32:33.225574 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.241935 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.258851 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.276179 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.967819 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/0.log" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.971513 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" 
event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91"} Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.972184 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.994951 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.014487 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.031602 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.047513 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.062018 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.079777 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.099786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.113480 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.127102 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.141848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\
\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"
2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.154265 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd890
9e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.168828 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.184435 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.211991 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.226043 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:34 crc kubenswrapper[4755]: E0320 13:32:34.226238 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.247785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.259802 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc 
kubenswrapper[4755]: I0320 13:32:34.270956 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.979853 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.981974 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/0.log" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.987218 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" exitCode=1 Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.987291 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91"} Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.987361 4755 scope.go:117] "RemoveContainer" containerID="52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.988599 4755 scope.go:117] "RemoveContainer" 
containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:34 crc kubenswrapper[4755]: E0320 13:32:34.988978 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.011137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.021948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.022213 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.022314 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs 
podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.022287488 +0000 UTC m=+150.620220057 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.031215 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.066400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.086137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.106083 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.128471 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.146217 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.161008 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.177540 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.199761 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.225357 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.225481 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.225557 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.225745 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.226332 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.226494 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.227121 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.249850 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.269218 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.296121 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.333577 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\"
:\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq
8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.353871 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc 
kubenswrapper[4755]: I0320 13:32:35.379209 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.994505 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595"} Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.997600 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.003416 4755 scope.go:117] "RemoveContainer" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:36 crc kubenswrapper[4755]: E0320 13:32:36.003671 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.032976 4755 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b63
1bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe64265
8386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.050006 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9ccc
d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.064343 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.080379 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.100025 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.119858 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.138937 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.160852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.179724 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.216905 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\"
:\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq
8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.224640 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:36 crc kubenswrapper[4755]: E0320 13:32:36.224836 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.235066 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc 
kubenswrapper[4755]: I0320 13:32:36.261014 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.283620 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.306799 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.327058 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: E0320 13:32:36.337916 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.346573 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c33
0ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.366887 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.386479 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.418740 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.433034 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.447733 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.463719 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.476971 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.490386 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.519206 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\
\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"
2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.545013 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd890
9e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.562079 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.576530 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.597469 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.625213 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.640682 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.656346 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.692644 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.707407 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:37 crc kubenswrapper[4755]: I0320 13:32:37.224684 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:37 crc kubenswrapper[4755]: I0320 13:32:37.224738 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:37 crc kubenswrapper[4755]: E0320 13:32:37.224931 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:37 crc kubenswrapper[4755]: I0320 13:32:37.224973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:37 crc kubenswrapper[4755]: E0320 13:32:37.225149 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:37 crc kubenswrapper[4755]: E0320 13:32:37.225305 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:38 crc kubenswrapper[4755]: I0320 13:32:38.224894 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:38 crc kubenswrapper[4755]: E0320 13:32:38.225154 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.225055 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.225895 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.225184 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.225171 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.225993 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.226281 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689096 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689243 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.716957 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723433 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.745721 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752628 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752706 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.777099 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783200 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783268 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.803360 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809113 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809163 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.832009 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.832253 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:40 crc kubenswrapper[4755]: I0320 13:32:40.225580 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:40 crc kubenswrapper[4755]: E0320 13:32:40.225875 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.225702 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.225740 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.225948 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.226121 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.226348 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.226583 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.248435 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.277829 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e
779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899a
fd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.293911 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.308293 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.325286 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.338375 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.343496 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/do
cker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.363884 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.378019 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.395587 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.411966 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.426426 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.445281 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.471296 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.488773 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.505039 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.526061 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.540223 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:42 crc kubenswrapper[4755]: I0320 13:32:42.224631 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:42 crc kubenswrapper[4755]: E0320 13:32:42.225254 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:43 crc kubenswrapper[4755]: I0320 13:32:43.224882 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:43 crc kubenswrapper[4755]: E0320 13:32:43.225110 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:43 crc kubenswrapper[4755]: I0320 13:32:43.225460 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:43 crc kubenswrapper[4755]: E0320 13:32:43.225568 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:43 crc kubenswrapper[4755]: I0320 13:32:43.226278 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:43 crc kubenswrapper[4755]: E0320 13:32:43.226484 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:44 crc kubenswrapper[4755]: I0320 13:32:44.224555 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:44 crc kubenswrapper[4755]: E0320 13:32:44.224777 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:45 crc kubenswrapper[4755]: I0320 13:32:45.225286 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:45 crc kubenswrapper[4755]: I0320 13:32:45.225383 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:45 crc kubenswrapper[4755]: E0320 13:32:45.225472 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:45 crc kubenswrapper[4755]: I0320 13:32:45.225458 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:45 crc kubenswrapper[4755]: E0320 13:32:45.225576 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:45 crc kubenswrapper[4755]: E0320 13:32:45.225804 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:46 crc kubenswrapper[4755]: I0320 13:32:46.224864 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:46 crc kubenswrapper[4755]: E0320 13:32:46.225030 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:46 crc kubenswrapper[4755]: E0320 13:32:46.340887 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:32:47 crc kubenswrapper[4755]: I0320 13:32:47.225285 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:47 crc kubenswrapper[4755]: I0320 13:32:47.225373 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:47 crc kubenswrapper[4755]: E0320 13:32:47.225439 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:47 crc kubenswrapper[4755]: E0320 13:32:47.225553 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:47 crc kubenswrapper[4755]: I0320 13:32:47.225637 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:47 crc kubenswrapper[4755]: E0320 13:32:47.225756 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:48 crc kubenswrapper[4755]: I0320 13:32:48.224980 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:48 crc kubenswrapper[4755]: E0320 13:32:48.225162 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.225843 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.225908 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.225843 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:49 crc kubenswrapper[4755]: E0320 13:32:49.226125 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:49 crc kubenswrapper[4755]: E0320 13:32:49.226240 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:49 crc kubenswrapper[4755]: E0320 13:32:49.226385 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.227877 4755 scope.go:117] "RemoveContainer" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.059557 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.063167 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.063754 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.082116 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.099681 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.114525 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.129256 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.143112 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.155422 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162966 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162979 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.163010 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.168330 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.175951 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.179866 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.179917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.180115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.180135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.180147 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.182983 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.192296 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195734 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195827 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195854 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.198318 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.208836 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.210491 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212718 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc 
kubenswrapper[4755]: I0320 13:32:50.212779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212793 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.225184 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.225208 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.225478 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.228887 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233218 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233232 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233264 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.243734 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.249485 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.249609 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.262686 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.279112 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc 
kubenswrapper[4755]: I0320 13:32:50.291998 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.316768 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.331405 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.051238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.051433 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.051994 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:23.051963823 +0000 UTC m=+182.649896382 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.070634 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.071573 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.075767 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" exitCode=1 Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.075869 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556"} Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.075950 4755 scope.go:117] "RemoveContainer" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 
13:32:51.076828 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.077283 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.100475 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.143712 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.152394 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.152476 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.152714 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.152674815 +0000 UTC m=+214.750607334 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.152792 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.152936 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.152899231 +0000 UTC m=+214.750831800 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.153534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.153761 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.153975 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.153953122 +0000 UTC m=+214.751885661 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.165691 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.185821 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f6470558
18ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.209281 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.225802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.227337 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.226455 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.227584 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.225825 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.227761 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.228639 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.252269 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.254747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 
13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.254798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255003 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255030 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255045 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255103 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.255087755 +0000 UTC m=+214.853020294 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255120 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255153 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255172 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255236 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.255218429 +0000 UTC m=+214.853150978 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.269313 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.287383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.303439 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.324153 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.342191 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.344458 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c33
0ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.371098 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.385518 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.404037 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.420338 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.432404 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.445687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.468797 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.483255 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.503364 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.521861 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.540313 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.555127 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.575947 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.592342 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.608419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.624164 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.639561 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.654435 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.672346 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.687641 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.709880 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.724694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.083452 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.089201 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:32:52 crc kubenswrapper[4755]: E0320 13:32:52.089462 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.105400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.140745 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.161172 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.178472 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.191816 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.209091 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.224324 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.224829 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:52 crc kubenswrapper[4755]: E0320 13:32:52.225004 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.239168 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.252733 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.273578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.293012 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.326431 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.344909 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.366808 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.382100 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.399456 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.416494 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:53 crc kubenswrapper[4755]: I0320 13:32:53.225000 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:53 crc kubenswrapper[4755]: I0320 13:32:53.225024 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:53 crc kubenswrapper[4755]: I0320 13:32:53.225205 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:53 crc kubenswrapper[4755]: E0320 13:32:53.225578 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:53 crc kubenswrapper[4755]: E0320 13:32:53.226138 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:53 crc kubenswrapper[4755]: E0320 13:32:53.226235 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:54 crc kubenswrapper[4755]: I0320 13:32:54.225685 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:54 crc kubenswrapper[4755]: E0320 13:32:54.225894 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:55 crc kubenswrapper[4755]: I0320 13:32:55.225193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:55 crc kubenswrapper[4755]: E0320 13:32:55.225421 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:55 crc kubenswrapper[4755]: I0320 13:32:55.225497 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:55 crc kubenswrapper[4755]: E0320 13:32:55.225802 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:55 crc kubenswrapper[4755]: I0320 13:32:55.225226 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:55 crc kubenswrapper[4755]: E0320 13:32:55.226274 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:56 crc kubenswrapper[4755]: I0320 13:32:56.225013 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:56 crc kubenswrapper[4755]: E0320 13:32:56.225167 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:56 crc kubenswrapper[4755]: E0320 13:32:56.343618 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:32:57 crc kubenswrapper[4755]: I0320 13:32:57.225374 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:57 crc kubenswrapper[4755]: I0320 13:32:57.226193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:57 crc kubenswrapper[4755]: I0320 13:32:57.225410 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:57 crc kubenswrapper[4755]: E0320 13:32:57.226428 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:57 crc kubenswrapper[4755]: E0320 13:32:57.226708 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:57 crc kubenswrapper[4755]: E0320 13:32:57.226868 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:58 crc kubenswrapper[4755]: I0320 13:32:58.225640 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:58 crc kubenswrapper[4755]: E0320 13:32:58.225900 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.224723 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.224861 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.224943 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:59 crc kubenswrapper[4755]: E0320 13:32:59.224913 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:59 crc kubenswrapper[4755]: E0320 13:32:59.225301 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:59 crc kubenswrapper[4755]: E0320 13:32:59.225518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.241792 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.224943 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.225179 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611728 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611741 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.631801 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.636963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637357 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637712 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.655826 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.660946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.660978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.660989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.661007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.661019 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.680846 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686791 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686859 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.705271 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.710998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.711465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.711640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.711978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.712177 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.733811 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.734086 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.224877 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.224891 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.225150 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.225336 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.225900 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.226186 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.241963 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771a
ee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.264335 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.286983 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.303477 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.329842 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.345520 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.365320 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.383704 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.424745 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.438881 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.466017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.482911 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.497302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.513509 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.530074 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.545123 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.564073 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.579731 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.596800 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:02 crc kubenswrapper[4755]: I0320 13:33:02.225544 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:02 crc kubenswrapper[4755]: E0320 13:33:02.226084 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:02 crc kubenswrapper[4755]: I0320 13:33:02.227351 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:33:02 crc kubenswrapper[4755]: E0320 13:33:02.227644 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:02 crc kubenswrapper[4755]: I0320 13:33:02.244192 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 13:33:03 crc kubenswrapper[4755]: I0320 13:33:03.225021 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:03 crc kubenswrapper[4755]: I0320 13:33:03.225127 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:03 crc kubenswrapper[4755]: I0320 13:33:03.225187 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:03 crc kubenswrapper[4755]: E0320 13:33:03.226014 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:03 crc kubenswrapper[4755]: E0320 13:33:03.226451 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:03 crc kubenswrapper[4755]: E0320 13:33:03.226853 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:04 crc kubenswrapper[4755]: I0320 13:33:04.225145 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:04 crc kubenswrapper[4755]: E0320 13:33:04.225356 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:05 crc kubenswrapper[4755]: I0320 13:33:05.225603 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:05 crc kubenswrapper[4755]: I0320 13:33:05.225737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:05 crc kubenswrapper[4755]: I0320 13:33:05.225869 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:05 crc kubenswrapper[4755]: E0320 13:33:05.225888 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:05 crc kubenswrapper[4755]: E0320 13:33:05.226033 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:05 crc kubenswrapper[4755]: E0320 13:33:05.226160 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:06 crc kubenswrapper[4755]: I0320 13:33:06.225429 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:06 crc kubenswrapper[4755]: E0320 13:33:06.225689 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:06 crc kubenswrapper[4755]: E0320 13:33:06.346904 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:07 crc kubenswrapper[4755]: I0320 13:33:07.227498 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:07 crc kubenswrapper[4755]: I0320 13:33:07.227608 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:07 crc kubenswrapper[4755]: E0320 13:33:07.228209 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:07 crc kubenswrapper[4755]: I0320 13:33:07.227805 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:07 crc kubenswrapper[4755]: E0320 13:33:07.228446 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:07 crc kubenswrapper[4755]: E0320 13:33:07.228615 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.225213 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:08 crc kubenswrapper[4755]: E0320 13:33:08.225927 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.446683 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/0.log" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.446945 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5ba4f17-8c41-4124-b563-01d5f1751139" containerID="0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545" exitCode=1 Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.446985 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerDied","Data":"0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545"} Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.447456 4755 scope.go:117] "RemoveContainer" containerID="0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.464709 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.479164 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.491429 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.517221 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.531241 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.543335 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.560181 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.572418 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.587182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.600212 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.611081 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.621342 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.633944 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.647321 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f
3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.657835 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.669381 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.682809 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.699976 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.709078 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.225621 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.225973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:09 crc kubenswrapper[4755]: E0320 13:33:09.225939 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:09 crc kubenswrapper[4755]: E0320 13:33:09.226133 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.226139 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:09 crc kubenswrapper[4755]: E0320 13:33:09.226239 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.457078 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/0.log" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.457202 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc"} Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.480252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.498560 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.515785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.539340 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.562813 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.581780 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.599795 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.624798 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\"
,\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.650849 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.669355 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.690024 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.710440 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.742732 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.760755 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.780408 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.797690 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.812715 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.847749 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.865217 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.225255 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.225465 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805679 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805899 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.827277 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833577 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.856380 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861314 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861392 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.883677 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889429 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889441 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.902345 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.906955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.906995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.907008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.907026 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.907040 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.920522 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.920674 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.225533 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.226218 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.226485 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.226584 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.226825 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.226938 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.252360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.278736 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.299936 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.324513 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.347579 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.352427 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.392075 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.417164 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.440533 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.458935 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f
3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.476171 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.495326 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.514222 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.540790 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.557619 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.576175 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.599220 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.613802 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.636418 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.650400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:12 crc kubenswrapper[4755]: I0320 13:33:12.225472 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:12 crc kubenswrapper[4755]: E0320 13:33:12.226804 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:13 crc kubenswrapper[4755]: I0320 13:33:13.225711 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:13 crc kubenswrapper[4755]: I0320 13:33:13.225711 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:13 crc kubenswrapper[4755]: I0320 13:33:13.225895 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:13 crc kubenswrapper[4755]: E0320 13:33:13.225986 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:13 crc kubenswrapper[4755]: E0320 13:33:13.226209 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:13 crc kubenswrapper[4755]: E0320 13:33:13.226381 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:14 crc kubenswrapper[4755]: I0320 13:33:14.225293 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:14 crc kubenswrapper[4755]: E0320 13:33:14.225579 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.225182 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.225253 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.225578 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:15 crc kubenswrapper[4755]: E0320 13:33:15.225775 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:15 crc kubenswrapper[4755]: E0320 13:33:15.225976 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.226109 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:33:15 crc kubenswrapper[4755]: E0320 13:33:15.226170 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.487023 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.492800 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.498252 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.519849 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.545384 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.562055 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.583552 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.599478 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.617377 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.632613 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.650005 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.673581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.686359 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc 
kubenswrapper[4755]: I0320 13:33:15.700152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.722013 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.743175 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.757257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.782179 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.796411 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.824078 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.846187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.882727 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.225407 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:16 crc kubenswrapper[4755]: E0320 13:33:16.225682 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:16 crc kubenswrapper[4755]: E0320 13:33:16.349846 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.500286 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.501799 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.506467 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" exitCode=1 Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.506552 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.506694 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.507998 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:16 crc kubenswrapper[4755]: E0320 13:33:16.508333 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.548292 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.569235 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.583121 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.595969 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.607727 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.622927 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.638522 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.660539 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.680737 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.706008 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.737058 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3
d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.750870 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.775422 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is 
after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.795071 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab
8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.815209 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.836353 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.857069 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.871581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.891271 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.225513 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.225773 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.225924 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.226740 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.226863 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.227105 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.514544 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.520311 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.520578 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.535777 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.554628 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.567352 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.598552 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.618957 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.637511 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.658835 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.680345 4755 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to 
/host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.696970 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.715101 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.735482 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40
6eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.754128 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.774312 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.794725 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.824186 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.857565 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.877516 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.904071 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.923360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:18 crc kubenswrapper[4755]: I0320 13:33:18.225204 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:18 crc kubenswrapper[4755]: E0320 13:33:18.225446 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:19 crc kubenswrapper[4755]: I0320 13:33:19.225523 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:19 crc kubenswrapper[4755]: I0320 13:33:19.225552 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:19 crc kubenswrapper[4755]: E0320 13:33:19.225815 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:19 crc kubenswrapper[4755]: I0320 13:33:19.225832 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:19 crc kubenswrapper[4755]: E0320 13:33:19.226045 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:19 crc kubenswrapper[4755]: E0320 13:33:19.225896 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:20 crc kubenswrapper[4755]: I0320 13:33:20.225198 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:20 crc kubenswrapper[4755]: E0320 13:33:20.225822 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085474 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085495 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.114215 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120568 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120586 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120613 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120631 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.143370 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149402 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149454 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149513 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.169177 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174717 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.196028 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201467 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.225217 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.225313 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.225687 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.226032 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.226165 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.226333 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.226996 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.227267 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.259519 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainer
Statuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.278743 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.299427 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.316425 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.335181 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.350616 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.356169 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.375622 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.395534 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.416967 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.444897 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.478877 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.494938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.516412 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.540473 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.572077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.591877 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.613847 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.632858 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.651049 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:22 crc kubenswrapper[4755]: I0320 13:33:22.225717 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:22 crc kubenswrapper[4755]: E0320 13:33:22.225925 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.147179 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.147409 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.148058 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:27.148028335 +0000 UTC m=+246.745960894 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.225723 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.225816 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.226291 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.226473 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.227015 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.227202 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:24 crc kubenswrapper[4755]: I0320 13:33:24.225206 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:24 crc kubenswrapper[4755]: E0320 13:33:24.225413 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:25 crc kubenswrapper[4755]: I0320 13:33:25.225564 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:25 crc kubenswrapper[4755]: I0320 13:33:25.225736 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:25 crc kubenswrapper[4755]: E0320 13:33:25.225866 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:25 crc kubenswrapper[4755]: I0320 13:33:25.225920 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:25 crc kubenswrapper[4755]: E0320 13:33:25.226176 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:25 crc kubenswrapper[4755]: E0320 13:33:25.226390 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:26 crc kubenswrapper[4755]: I0320 13:33:26.225132 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:26 crc kubenswrapper[4755]: E0320 13:33:26.225345 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:26 crc kubenswrapper[4755]: E0320 13:33:26.352143 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:27 crc kubenswrapper[4755]: I0320 13:33:27.225868 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:27 crc kubenswrapper[4755]: I0320 13:33:27.226025 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:27 crc kubenswrapper[4755]: I0320 13:33:27.226270 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:27 crc kubenswrapper[4755]: E0320 13:33:27.226458 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:27 crc kubenswrapper[4755]: E0320 13:33:27.226619 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:27 crc kubenswrapper[4755]: E0320 13:33:27.226822 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:28 crc kubenswrapper[4755]: I0320 13:33:28.224962 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:28 crc kubenswrapper[4755]: E0320 13:33:28.225325 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:29 crc kubenswrapper[4755]: I0320 13:33:29.225182 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:29 crc kubenswrapper[4755]: E0320 13:33:29.225412 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:29 crc kubenswrapper[4755]: I0320 13:33:29.225806 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:29 crc kubenswrapper[4755]: E0320 13:33:29.225945 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:29 crc kubenswrapper[4755]: I0320 13:33:29.226090 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:29 crc kubenswrapper[4755]: E0320 13:33:29.226407 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:30 crc kubenswrapper[4755]: I0320 13:33:30.225297 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:30 crc kubenswrapper[4755]: E0320 13:33:30.225518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.225147 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.225306 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.225548 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.225620 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.225853 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.225960 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.253841 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.272614 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.292364 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.324492 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.350647 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.353459 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.373733 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.392109 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.409120 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.428761 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.451611 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.472400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.489531 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.507840 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517282 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.518060 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.527001 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.531493 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536607 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.551565 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.556176 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557549 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557628 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557641 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.572321 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc 
kubenswrapper[4755]: E0320 13:33:31.573550 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579460 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579734 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579943 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.594391 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.597953 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602033 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602093 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602113 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602123 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.608878 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.615354 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.615521 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.623044 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:32 crc kubenswrapper[4755]: I0320 13:33:32.224901 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:32 crc kubenswrapper[4755]: E0320 13:33:32.226115 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.225554 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.225709 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.225554 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.225865 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.226002 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.226590 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.227184 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.227452 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:34 crc kubenswrapper[4755]: I0320 13:33:34.234561 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:34 crc kubenswrapper[4755]: E0320 13:33:34.234787 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:35 crc kubenswrapper[4755]: I0320 13:33:35.225051 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:35 crc kubenswrapper[4755]: I0320 13:33:35.225136 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:35 crc kubenswrapper[4755]: E0320 13:33:35.225289 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:35 crc kubenswrapper[4755]: I0320 13:33:35.225169 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:35 crc kubenswrapper[4755]: E0320 13:33:35.225432 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:35 crc kubenswrapper[4755]: E0320 13:33:35.225772 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:36 crc kubenswrapper[4755]: I0320 13:33:36.225131 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:36 crc kubenswrapper[4755]: E0320 13:33:36.225377 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:36 crc kubenswrapper[4755]: E0320 13:33:36.356375 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:37 crc kubenswrapper[4755]: I0320 13:33:37.225439 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:37 crc kubenswrapper[4755]: I0320 13:33:37.225518 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:37 crc kubenswrapper[4755]: I0320 13:33:37.225597 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:37 crc kubenswrapper[4755]: E0320 13:33:37.225777 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:37 crc kubenswrapper[4755]: E0320 13:33:37.226052 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:37 crc kubenswrapper[4755]: E0320 13:33:37.226419 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:38 crc kubenswrapper[4755]: I0320 13:33:38.225699 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:38 crc kubenswrapper[4755]: E0320 13:33:38.226309 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:39 crc kubenswrapper[4755]: I0320 13:33:39.225603 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:39 crc kubenswrapper[4755]: I0320 13:33:39.225775 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:39 crc kubenswrapper[4755]: I0320 13:33:39.225851 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:39 crc kubenswrapper[4755]: E0320 13:33:39.226042 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:39 crc kubenswrapper[4755]: E0320 13:33:39.226117 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:39 crc kubenswrapper[4755]: E0320 13:33:39.226190 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:40 crc kubenswrapper[4755]: I0320 13:33:40.225412 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:40 crc kubenswrapper[4755]: E0320 13:33:40.225752 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.225432 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.225722 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.225997 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.225988 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.226189 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.226946 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.259922 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.277987 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.299788 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.318976 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.341346 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.357749 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.365419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.385393 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.408608 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.425498 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.439860 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.461757 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.477969 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.492383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.514239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.530182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40
6eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.552599 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.572422 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.593461 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.610431 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766554 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.784562 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.789882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.789959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.789978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.790007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.790027 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.808910 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813530 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813587 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813690 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.829956 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835016 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835205 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.881624 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:42 crc kubenswrapper[4755]: I0320 13:33:42.225312 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:42 crc kubenswrapper[4755]: E0320 13:33:42.225578 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:43 crc kubenswrapper[4755]: I0320 13:33:43.225297 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:43 crc kubenswrapper[4755]: E0320 13:33:43.225561 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:43 crc kubenswrapper[4755]: I0320 13:33:43.225944 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:43 crc kubenswrapper[4755]: E0320 13:33:43.226075 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:43 crc kubenswrapper[4755]: I0320 13:33:43.226116 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:43 crc kubenswrapper[4755]: E0320 13:33:43.226278 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:44 crc kubenswrapper[4755]: I0320 13:33:44.225702 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:44 crc kubenswrapper[4755]: E0320 13:33:44.226113 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.224966 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.225269 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.225250 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.225514 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.225599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.226088 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.226488 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.226744 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:46 crc kubenswrapper[4755]: I0320 13:33:46.224995 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:46 crc kubenswrapper[4755]: E0320 13:33:46.225274 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:46 crc kubenswrapper[4755]: E0320 13:33:46.358964 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:47 crc kubenswrapper[4755]: I0320 13:33:47.225437 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:47 crc kubenswrapper[4755]: I0320 13:33:47.225512 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:47 crc kubenswrapper[4755]: I0320 13:33:47.225512 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:47 crc kubenswrapper[4755]: E0320 13:33:47.225648 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:47 crc kubenswrapper[4755]: E0320 13:33:47.225826 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:47 crc kubenswrapper[4755]: E0320 13:33:47.226066 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:48 crc kubenswrapper[4755]: I0320 13:33:48.225606 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:48 crc kubenswrapper[4755]: E0320 13:33:48.225969 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:49 crc kubenswrapper[4755]: I0320 13:33:49.225593 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:49 crc kubenswrapper[4755]: I0320 13:33:49.225770 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:49 crc kubenswrapper[4755]: E0320 13:33:49.225824 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:49 crc kubenswrapper[4755]: I0320 13:33:49.225870 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:49 crc kubenswrapper[4755]: E0320 13:33:49.226195 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:49 crc kubenswrapper[4755]: E0320 13:33:49.226281 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:50 crc kubenswrapper[4755]: I0320 13:33:50.224706 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:50 crc kubenswrapper[4755]: E0320 13:33:50.224929 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.225424 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.225453 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.225489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.226598 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.226701 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.227338 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.361758 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.364554 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" podStartSLOduration=139.364539906 podStartE2EDuration="2m19.364539906s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.331356355 +0000 UTC m=+210.929288884" watchObservedRunningTime="2026-03-20 13:33:51.364539906 +0000 UTC m=+210.962472435" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.403693 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=117.403635627 podStartE2EDuration="1m57.403635627s" podCreationTimestamp="2026-03-20 13:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.403529564 +0000 UTC m=+211.001462153" watchObservedRunningTime="2026-03-20 13:33:51.403635627 +0000 UTC m=+211.001568166" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.415162 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=49.415126107 podStartE2EDuration="49.415126107s" podCreationTimestamp="2026-03-20 13:33:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.414800727 +0000 UTC m=+211.012733286" watchObservedRunningTime="2026-03-20 13:33:51.415126107 +0000 UTC m=+211.013058676" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.431852 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zf67p" podStartSLOduration=139.431826356 podStartE2EDuration="2m19.431826356s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.431028641 +0000 UTC m=+211.028961210" watchObservedRunningTime="2026-03-20 13:33:51.431826356 +0000 UTC m=+211.029758895" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.460199 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=52.460170899 podStartE2EDuration="52.460170899s" podCreationTimestamp="2026-03-20 13:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.459924032 +0000 UTC m=+211.057856581" watchObservedRunningTime="2026-03-20 13:33:51.460170899 +0000 UTC m=+211.058103438" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.475539 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podStartSLOduration=139.475509226 podStartE2EDuration="2m19.475509226s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.474504346 +0000 UTC m=+211.072436895" watchObservedRunningTime="2026-03-20 13:33:51.475509226 +0000 UTC m=+211.073441775" Mar 20 13:33:51 crc 
kubenswrapper[4755]: I0320 13:33:51.578697 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=104.578633898 podStartE2EDuration="1m44.578633898s" podCreationTimestamp="2026-03-20 13:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.555946367 +0000 UTC m=+211.153878926" watchObservedRunningTime="2026-03-20 13:33:51.578633898 +0000 UTC m=+211.176566437" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.624356 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8btvn" podStartSLOduration=139.62433915 podStartE2EDuration="2m19.62433915s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.624013211 +0000 UTC m=+211.221945740" watchObservedRunningTime="2026-03-20 13:33:51.62433915 +0000 UTC m=+211.222271679" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.653741 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b9bt2" podStartSLOduration=139.653711696 podStartE2EDuration="2m19.653711696s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.639638077 +0000 UTC m=+211.237570606" watchObservedRunningTime="2026-03-20 13:33:51.653711696 +0000 UTC m=+211.251644225" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.653901 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" podStartSLOduration=139.653890401 podStartE2EDuration="2m19.653890401s" podCreationTimestamp="2026-03-20 13:31:32 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.652693995 +0000 UTC m=+211.250626524" watchObservedRunningTime="2026-03-20 13:33:51.653890401 +0000 UTC m=+211.251822930" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.667602 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=89.667577128 podStartE2EDuration="1m29.667577128s" podCreationTimestamp="2026-03-20 13:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.667386113 +0000 UTC m=+211.265318642" watchObservedRunningTime="2026-03-20 13:33:51.667577128 +0000 UTC m=+211.265509657" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074637 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074802 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:52Z","lastTransitionTime":"2026-03-20T13:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.151409 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7"] Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.152269 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.155073 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.155461 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.164425 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.164791 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205461 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205547 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efe9b26-e5b1-4167-bc7e-41c7d836013d-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205604 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0efe9b26-e5b1-4167-bc7e-41c7d836013d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0efe9b26-e5b1-4167-bc7e-41c7d836013d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205808 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.225246 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:52 crc kubenswrapper[4755]: E0320 13:33:52.225607 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.287768 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.306475 4755 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.306998 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0efe9b26-e5b1-4167-bc7e-41c7d836013d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307226 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0efe9b26-e5b1-4167-bc7e-41c7d836013d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307237 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307324 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0efe9b26-e5b1-4167-bc7e-41c7d836013d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc 
kubenswrapper[4755]: I0320 13:33:52.311591 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0efe9b26-e5b1-4167-bc7e-41c7d836013d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.314688 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efe9b26-e5b1-4167-bc7e-41c7d836013d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.327640 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0efe9b26-e5b1-4167-bc7e-41c7d836013d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.478174 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.674917 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" event={"ID":"0efe9b26-e5b1-4167-bc7e-41c7d836013d","Type":"ContainerStarted","Data":"e9b47cfe036a54290ccb8766855d3184e24f28e174d798b12ce50d5671fb9c0b"} Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.674974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" event={"ID":"0efe9b26-e5b1-4167-bc7e-41c7d836013d","Type":"ContainerStarted","Data":"70252601887caca1e705617eecd9b0caf1e463f9164adc83eb8f5158ae933436"} Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.694585 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" podStartSLOduration=140.694556156 podStartE2EDuration="2m20.694556156s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:52.691236385 +0000 UTC m=+212.289168914" watchObservedRunningTime="2026-03-20 13:33:52.694556156 +0000 UTC m=+212.292488715" Mar 20 13:33:53 crc kubenswrapper[4755]: I0320 13:33:53.224980 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:53 crc kubenswrapper[4755]: I0320 13:33:53.225042 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:53 crc kubenswrapper[4755]: I0320 13:33:53.225093 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:53 crc kubenswrapper[4755]: E0320 13:33:53.225165 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:53 crc kubenswrapper[4755]: E0320 13:33:53.225231 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:53 crc kubenswrapper[4755]: E0320 13:33:53.225343 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.225336 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:54 crc kubenswrapper[4755]: E0320 13:33:54.225987 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.686915 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687626 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/0.log" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687734 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5ba4f17-8c41-4124-b563-01d5f1751139" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" exitCode=1 Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687785 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerDied","Data":"cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc"} Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687836 4755 scope.go:117] "RemoveContainer" containerID="0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.688833 4755 scope.go:117] "RemoveContainer" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" Mar 20 13:33:54 crc kubenswrapper[4755]: E0320 13:33:54.689294 4755 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139)\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.225331 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.225468 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.225565 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.225331 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.225754 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.225914 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.246838 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.247038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.247152 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247298 4755 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247393 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.247366923 +0000 UTC m=+336.845299492 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247493 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.247477666 +0000 UTC m=+336.845410235 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247640 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247734 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.247720284 +0000 UTC m=+336.845652853 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.348027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.348126 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348332 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348400 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348471 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348411 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348537 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348563 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348615 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.348570696 +0000 UTC m=+336.946503385 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348695 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.348636038 +0000 UTC m=+336.946568837 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.695952 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:33:56 crc kubenswrapper[4755]: I0320 13:33:56.224985 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:56 crc kubenswrapper[4755]: E0320 13:33:56.225244 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:56 crc kubenswrapper[4755]: E0320 13:33:56.363114 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:57 crc kubenswrapper[4755]: I0320 13:33:57.231060 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:57 crc kubenswrapper[4755]: E0320 13:33:57.231272 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:57 crc kubenswrapper[4755]: I0320 13:33:57.231607 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:57 crc kubenswrapper[4755]: E0320 13:33:57.231733 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:57 crc kubenswrapper[4755]: I0320 13:33:57.231978 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:57 crc kubenswrapper[4755]: E0320 13:33:57.232084 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.224776 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:58 crc kubenswrapper[4755]: E0320 13:33:58.226033 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.226241 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.711939 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.716197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.717887 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.777390 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podStartSLOduration=146.777367981 podStartE2EDuration="2m26.777367981s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:58.775467092 +0000 UTC m=+218.373399631" watchObservedRunningTime="2026-03-20 13:33:58.777367981 +0000 UTC m=+218.375300520" Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.195626 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpm42"] Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.195797 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:59 crc kubenswrapper[4755]: E0320 13:33:59.195923 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.225676 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.225792 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:59 crc kubenswrapper[4755]: E0320 13:33:59.225852 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:59 crc kubenswrapper[4755]: E0320 13:33:59.225990 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:00 crc kubenswrapper[4755]: I0320 13:34:00.225247 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:00 crc kubenswrapper[4755]: E0320 13:34:00.226056 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:01 crc kubenswrapper[4755]: I0320 13:34:01.224879 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:01 crc kubenswrapper[4755]: I0320 13:34:01.224973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:01 crc kubenswrapper[4755]: I0320 13:34:01.225073 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.227274 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.227420 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.227551 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.363751 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:02 crc kubenswrapper[4755]: I0320 13:34:02.224979 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:02 crc kubenswrapper[4755]: E0320 13:34:02.225426 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:03 crc kubenswrapper[4755]: I0320 13:34:03.225456 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:03 crc kubenswrapper[4755]: I0320 13:34:03.225575 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:03 crc kubenswrapper[4755]: E0320 13:34:03.225707 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:03 crc kubenswrapper[4755]: E0320 13:34:03.225849 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:03 crc kubenswrapper[4755]: I0320 13:34:03.225962 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:03 crc kubenswrapper[4755]: E0320 13:34:03.226224 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:04 crc kubenswrapper[4755]: I0320 13:34:04.224715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:04 crc kubenswrapper[4755]: E0320 13:34:04.224925 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:05 crc kubenswrapper[4755]: I0320 13:34:05.225590 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:05 crc kubenswrapper[4755]: I0320 13:34:05.225689 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:05 crc kubenswrapper[4755]: E0320 13:34:05.225887 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:05 crc kubenswrapper[4755]: I0320 13:34:05.225922 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:05 crc kubenswrapper[4755]: E0320 13:34:05.226160 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:05 crc kubenswrapper[4755]: E0320 13:34:05.226302 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.225188 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:06 crc kubenswrapper[4755]: E0320 13:34:06.225410 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.226209 4755 scope.go:117] "RemoveContainer" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" Mar 20 13:34:06 crc kubenswrapper[4755]: E0320 13:34:06.405428 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.754961 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.755072 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552"} Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.150284 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.224938 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.224984 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.224964 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:07 crc kubenswrapper[4755]: E0320 13:34:07.225156 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:07 crc kubenswrapper[4755]: E0320 13:34:07.225336 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:07 crc kubenswrapper[4755]: E0320 13:34:07.225581 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:08 crc kubenswrapper[4755]: I0320 13:34:08.224862 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:08 crc kubenswrapper[4755]: E0320 13:34:08.225130 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:09 crc kubenswrapper[4755]: I0320 13:34:09.224783 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:09 crc kubenswrapper[4755]: I0320 13:34:09.224783 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:09 crc kubenswrapper[4755]: E0320 13:34:09.224981 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:09 crc kubenswrapper[4755]: I0320 13:34:09.225086 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:09 crc kubenswrapper[4755]: E0320 13:34:09.225195 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:09 crc kubenswrapper[4755]: E0320 13:34:09.225315 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:10 crc kubenswrapper[4755]: I0320 13:34:10.224747 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:10 crc kubenswrapper[4755]: E0320 13:34:10.225004 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:11 crc kubenswrapper[4755]: I0320 13:34:11.225259 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:11 crc kubenswrapper[4755]: I0320 13:34:11.225421 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:11 crc kubenswrapper[4755]: I0320 13:34:11.225583 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:11 crc kubenswrapper[4755]: E0320 13:34:11.227451 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:11 crc kubenswrapper[4755]: E0320 13:34:11.228001 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:11 crc kubenswrapper[4755]: E0320 13:34:11.227875 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.225371 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.228968 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.229582 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.464248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.530997 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.531714 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.532318 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.532388 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.532875 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.533411 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.534161 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.534432 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.548006 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ql2s"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.549261 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.550842 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551257 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551417 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551543 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551761 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551902 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:34:12 
crc kubenswrapper[4755]: I0320 13:34:12.552054 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.552230 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.557310 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.557845 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.558041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.563858 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.568894 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-64g8c"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.572328 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.574351 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.575709 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.576305 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.585368 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.602402 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.603301 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611302 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-dir\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-etcd-serving-ca\") 
pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nm7\" (UniqueName: \"kubernetes.io/projected/45c99095-eab0-49c4-8ded-fc5359b43ef2-kube-api-access-n7nm7\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh79l\" (UniqueName: \"kubernetes.io/projected/39556510-7df7-4b2f-94d6-75b649060c22-kube-api-access-fh79l\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611963 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612042 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612206 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611889 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612326 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611937 4755 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612430 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-auth-proxy-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612466 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/39556510-7df7-4b2f-94d6-75b649060c22-machine-approver-tls\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612018 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612682 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-serving-cert\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612716 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-node-pullsecrets\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-serving-cert\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612765 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-encryption-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-client\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612817 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-etcd-client\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-audit-dir\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612927 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-encryption-config\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612950 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612973 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd4b4\" (UniqueName: \"kubernetes.io/projected/b41fdebf-1886-4b30-b583-368242316562-kube-api-access-sd4b4\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612997 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-service-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613024 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: 
\"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613048 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613091 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-image-import-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-serving-cert\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613173 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-policies\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613199 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-audit\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613267 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl9hf\" (UniqueName: \"kubernetes.io/projected/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-kube-api-access-kl9hf\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: 
\"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-config\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l499t\" (UniqueName: \"kubernetes.io/projected/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-kube-api-access-l499t\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612056 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612147 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612185 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612223 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.633149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634043 4755 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634387 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634598 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634776 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634605 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.635056 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637131 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637371 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637529 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637722 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:12 crc 
kubenswrapper[4755]: I0320 13:34:12.637880 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.638097 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.638746 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.638861 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639061 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639086 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639218 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639323 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639418 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639426 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639683 4755 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.640803 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-l4v7x"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.641322 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.642899 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643409 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643629 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643863 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644015 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644204 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644358 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644835 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.645781 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.650559 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.650877 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.655146 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.676705 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.677308 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.677836 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.697116 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.697692 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698074 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698374 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698501 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698521 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.699804 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zgr4"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.700412 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.706480 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707025 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707766 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707988 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708220 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708310 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708399 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 
13:34:12.708461 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708572 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708581 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708777 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708913 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.709039 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.709185 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.709594 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711350 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711494 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711573 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 
13:34:12.711785 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711799 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711884 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711889 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711937 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711977 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.712003 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714413 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: 
\"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714443 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714471 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-auth-proxy-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714505 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714539 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714570 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/39556510-7df7-4b2f-94d6-75b649060c22-machine-approver-tls\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.715852 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjzz\" (UniqueName: \"kubernetes.io/projected/9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8-kube-api-access-ltjzz\") pod \"downloads-7954f5f757-l4v7x\" (UID: \"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8\") " 
pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.715918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/efeb6afa-e175-4bad-a0bb-5ace61619959-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.717362 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-auth-proxy-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.717599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.718234 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-node-pullsecrets\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.719192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod 
\"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.724217 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.725590 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-node-pullsecrets\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-serving-cert\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-encryption-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731790 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-client\") pod 
\"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731843 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-serving-cert\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efeb6afa-e175-4bad-a0bb-5ace61619959-serving-cert\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731970 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731995 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-etcd-client\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-audit-dir\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732066 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-images\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732116 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-encryption-config\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fdd6691-9136-43ba-abea-7ba6862e9681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd4b4\" (UniqueName: \"kubernetes.io/projected/b41fdebf-1886-4b30-b583-368242316562-kube-api-access-sd4b4\") pod \"apiserver-76f77b778f-6ql2s\" (UID: 
\"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732249 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f67c724-386b-4736-ace1-73430edd3558-metrics-tls\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-service-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732311 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732336 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732407 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-image-import-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732450 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732474 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-policies\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-serving-cert\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732533 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732579 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-audit\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl9hf\" (UniqueName: \"kubernetes.io/projected/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-kube-api-access-kl9hf\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-config\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l499t\" (UniqueName: \"kubernetes.io/projected/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-kube-api-access-l499t\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732766 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732789 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732820 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-dir\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-etcd-serving-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732920 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhcpn\" (UniqueName: \"kubernetes.io/projected/1fdd6691-9136-43ba-abea-7ba6862e9681-kube-api-access-mhcpn\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e3cedcb-923f-4cf5-b344-dd3842309d39-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732967 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4qp2\" (UniqueName: \"kubernetes.io/projected/efeb6afa-e175-4bad-a0bb-5ace61619959-kube-api-access-p4qp2\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733000 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-config\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733051 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq5qt\" (UniqueName: \"kubernetes.io/projected/5f67c724-386b-4736-ace1-73430edd3558-kube-api-access-bq5qt\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nm7\" (UniqueName: \"kubernetes.io/projected/45c99095-eab0-49c4-8ded-fc5359b43ef2-kube-api-access-n7nm7\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733108 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh79l\" (UniqueName: \"kubernetes.io/projected/39556510-7df7-4b2f-94d6-75b649060c22-kube-api-access-fh79l\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733158 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733182 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msjhl\" (UniqueName: \"kubernetes.io/projected/7e3cedcb-923f-4cf5-b344-dd3842309d39-kube-api-access-msjhl\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.734562 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.735882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-audit\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.737980 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.738469 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.738815 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-encryption-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.739855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.740326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.740676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-image-import-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741166 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741335 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-policies\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741642 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741989 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-audit-dir\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.742704 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.742961 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743128 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-dir\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743487 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743811 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-config\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743854 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.744125 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-etcd-serving-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.744366 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.744870 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.745041 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.774101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " 
pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.774687 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-client\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.778267 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/39556510-7df7-4b2f-94d6-75b649060c22-machine-approver-tls\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.778588 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-serving-cert\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.779441 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.779900 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-serving-cert\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: 
\"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.780294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-etcd-client\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.784894 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.786426 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-serving-cert\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.787238 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-encryption-config\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.787377 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.788428 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.788703 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.790448 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.791321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.792487 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.793444 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.794915 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.800357 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.802574 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.806122 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.808625 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.811741 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.812754 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qkxhv"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.813588 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.813622 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.818148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.822369 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.822447 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.822631 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.823591 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.824113 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r48mq"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.824948 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825282 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825508 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825710 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825719 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.827002 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.827556 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.827582 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828191 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l499t\" (UniqueName: \"kubernetes.io/projected/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-kube-api-access-l499t\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828273 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh79l\" (UniqueName: \"kubernetes.io/projected/39556510-7df7-4b2f-94d6-75b649060c22-kube-api-access-fh79l\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828763 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.829265 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.830459 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd4b4\" (UniqueName: \"kubernetes.io/projected/b41fdebf-1886-4b30-b583-368242316562-kube-api-access-sd4b4\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.830505 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.831599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.832508 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833169 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833446 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833582 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl9hf\" (UniqueName: \"kubernetes.io/projected/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-kube-api-access-kl9hf\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833981 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834207 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834252 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834341 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhcpn\" (UniqueName: \"kubernetes.io/projected/1fdd6691-9136-43ba-abea-7ba6862e9681-kube-api-access-mhcpn\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834384 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e3cedcb-923f-4cf5-b344-dd3842309d39-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834422 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4qp2\" (UniqueName: \"kubernetes.io/projected/efeb6afa-e175-4bad-a0bb-5ace61619959-kube-api-access-p4qp2\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834442 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04d33b23-44ac-48b5-8981-fe9a764b1bee-metrics-tls\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834458 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04d33b23-44ac-48b5-8981-fe9a764b1bee-trusted-ca\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834497 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-config\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834517 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834543 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq5qt\" (UniqueName: \"kubernetes.io/projected/5f67c724-386b-4736-ace1-73430edd3558-kube-api-access-bq5qt\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-trusted-ca\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-kube-api-access-v2l2c\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834684 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 
crc kubenswrapper[4755]: I0320 13:34:12.834707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834767 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msjhl\" (UniqueName: \"kubernetes.io/projected/7e3cedcb-923f-4cf5-b344-dd3842309d39-kube-api-access-msjhl\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-config\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834809 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834845 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/862bcbad-0c15-4e2d-b205-83ab3721cd9a-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835069 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835103 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjzz\" (UniqueName: \"kubernetes.io/projected/9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8-kube-api-access-ltjzz\") pod \"downloads-7954f5f757-l4v7x\" (UID: \"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8\") " pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835153 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/efeb6afa-e175-4bad-a0bb-5ace61619959-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835202 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efeb6afa-e175-4bad-a0bb-5ace61619959-serving-cert\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835269 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: 
\"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-images\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835307 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835428 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fdd6691-9136-43ba-abea-7ba6862e9681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835448 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f67c724-386b-4736-ace1-73430edd3558-metrics-tls\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4pm\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-kube-api-access-6p4pm\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc 
kubenswrapper[4755]: I0320 13:34:12.836314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/efeb6afa-e175-4bad-a0bb-5ace61619959-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.836955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.839989 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.841368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-config\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.842142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.842768 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846394 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846590 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846703 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b9cfbce-3f17-4155-a022-243e6d220bf8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-images\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847176 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847617 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b9cfbce-3f17-4155-a022-243e6d220bf8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862bcbad-0c15-4e2d-b205-83ab3721cd9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 
20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847808 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862bcbad-0c15-4e2d-b205-83ab3721cd9a-config\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpd9k\" (UniqueName: \"kubernetes.io/projected/8ff5ba16-93f9-4313-a857-23a1c87c1cac-kube-api-access-lpd9k\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848234 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8ff5ba16-93f9-4313-a857-23a1c87c1cac-serving-cert\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848521 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848285 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f67c724-386b-4736-ace1-73430edd3558-metrics-tls\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nm7\" (UniqueName: \"kubernetes.io/projected/45c99095-eab0-49c4-8ded-fc5359b43ef2-kube-api-access-n7nm7\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.849567 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.850429 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.850472 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.850559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e3cedcb-923f-4cf5-b344-dd3842309d39-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853098 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efeb6afa-e175-4bad-a0bb-5ace61619959-serving-cert\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853560 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.854605 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.855368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fdd6691-9136-43ba-abea-7ba6862e9681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.855562 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.857331 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.858232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.860900 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.863162 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.866148 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.866598 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.867538 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.868200 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgkhb"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.868585 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.870885 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.872076 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.873975 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.875109 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.875548 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.876014 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.876298 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.878409 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.878532 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.880008 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.880804 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.881166 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.881767 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.882229 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.883361 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.883829 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2b4nn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.884325 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.884858 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.885333 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.886729 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.887956 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.889223 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.889810 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ql2s"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.892853 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.894064 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.894460 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.895107 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-64g8c"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.897127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.898121 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"] Mar 20 13:34:12 crc 
kubenswrapper[4755]: I0320 13:34:12.899417 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.900735 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.902282 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.902952 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.903937 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.904971 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.906267 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.906922 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.908075 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.909548 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-l4v7x"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.909983 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.910169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.911367 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.912627 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.913902 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zgr4"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.915159 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.915977 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.916428 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r96g9"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.917231 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.918589 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.919572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.920592 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.921467 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.922445 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.923445 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hpj2j"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.925028 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qkxhv"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.925380 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.925936 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.927026 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.928314 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.930882 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.934313 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.934516 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgkhb"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.935257 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.935684 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.936864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hpj2j"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.938127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r96g9"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.939536 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.941333 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2rw7x"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.945037 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.946589 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2rw7x"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.951855 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.951988 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04d33b23-44ac-48b5-8981-fe9a764b1bee-metrics-tls\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952064 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04d33b23-44ac-48b5-8981-fe9a764b1bee-trusted-ca\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-trusted-ca\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952460 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-kube-api-access-v2l2c\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-config\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952698 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwwp\" (UniqueName: \"kubernetes.io/projected/c452cf60-67a1-434f-b2da-7e81992e28a6-kube-api-access-qnwwp\") pod \"migrator-59844c95c7-ggscl\" (UID: \"c452cf60-67a1-434f-b2da-7e81992e28a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952778 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b27e760-b22d-415a-93cc-866c2471ee63-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:12 
crc kubenswrapper[4755]: I0320 13:34:12.952871 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-profile-collector-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.953787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.953897 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/862bcbad-0c15-4e2d-b205-83ab3721cd9a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.954778 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-trusted-ca\") pod \"console-operator-58897d9998-6zgr4\" 
(UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955287 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955344 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955515 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-config\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b27e760-b22d-415a-93cc-866c2471ee63-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956316 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 
13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p4pm\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-kube-api-access-6p4pm\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956450 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956471 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86bzg\" (UniqueName: \"kubernetes.io/projected/673ae012-3e48-4408-8a01-a67833cabd26-kube-api-access-86bzg\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b9cfbce-3f17-4155-a022-243e6d220bf8-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b9cfbce-3f17-4155-a022-243e6d220bf8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956608 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862bcbad-0c15-4e2d-b205-83ab3721cd9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956634 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862bcbad-0c15-4e2d-b205-83ab3721cd9a-config\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956680 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpd9k\" (UniqueName: \"kubernetes.io/projected/8ff5ba16-93f9-4313-a857-23a1c87c1cac-kube-api-access-lpd9k\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956732 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b27e760-b22d-415a-93cc-866c2471ee63-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ff5ba16-93f9-4313-a857-23a1c87c1cac-serving-cert\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956780 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.957683 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" 
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.958217 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.959789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b9cfbce-3f17-4155-a022-243e6d220bf8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.959915 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862bcbad-0c15-4e2d-b205-83ab3721cd9a-config\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.961516 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862bcbad-0c15-4e2d-b205-83ab3721cd9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.962556 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.963466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04d33b23-44ac-48b5-8981-fe9a764b1bee-metrics-tls\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.968866 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.970087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.970355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ff5ba16-93f9-4313-a857-23a1c87c1cac-serving-cert\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.970759 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.973102 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.976475 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04d33b23-44ac-48b5-8981-fe9a764b1bee-trusted-ca\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.978439 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.980880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b9cfbce-3f17-4155-a022-243e6d220bf8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.981215 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.988978 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.995688 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.996550 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.007117 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.017498 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.036137 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.055372 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058412 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwwp\" (UniqueName: \"kubernetes.io/projected/c452cf60-67a1-434f-b2da-7e81992e28a6-kube-api-access-qnwwp\") pod \"migrator-59844c95c7-ggscl\" (UID: \"c452cf60-67a1-434f-b2da-7e81992e28a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058450 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8b27e760-b22d-415a-93cc-866c2471ee63-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-profile-collector-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b27e760-b22d-415a-93cc-866c2471ee63-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058589 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 
13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058609 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058636 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86bzg\" (UniqueName: \"kubernetes.io/projected/673ae012-3e48-4408-8a01-a67833cabd26-kube-api-access-86bzg\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b27e760-b22d-415a-93cc-866c2471ee63-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.100921 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.116626 4755 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.135177 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.155952 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.176240 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.198869 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.198932 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.222530 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.224989 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.225031 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.225210 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.235873 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.241716 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.260917 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.275725 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.297636 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.311561 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.316534 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.335744 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.351123 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.355097 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:34:13 crc 
kubenswrapper[4755]: W0320 13:34:13.372381 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0bdc24c_c8d3_458e_8cf8_d91164ef2b9d.slice/crio-39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970 WatchSource:0}: Error finding container 39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970: Status 404 returned error can't find the container with id 39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970 Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.376087 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.395566 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.416364 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.435324 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.456760 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.476005 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.496911 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.516093 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.535130 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.555437 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.576069 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.585634 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.586687 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ql2s"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.595770 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.615195 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.635359 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: W0320 13:34:13.655598 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41fdebf_1886_4b30_b583_368242316562.slice/crio-5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab WatchSource:0}: Error finding container 
5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab: Status 404 returned error can't find the container with id 5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab Mar 20 13:34:13 crc kubenswrapper[4755]: W0320 13:34:13.656001 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c8ae1f_e5a9_4ac8_8af7_2169378af3d2.slice/crio-3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198 WatchSource:0}: Error finding container 3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198: Status 404 returned error can't find the container with id 3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198 Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.661881 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.677001 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.697476 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.716631 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.736435 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.756545 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.778327 4755 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.787195 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b27e760-b22d-415a-93cc-866c2471ee63-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.795089 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.801182 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b27e760-b22d-415a-93cc-866c2471ee63-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.808564 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" event={"ID":"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec","Type":"ContainerStarted","Data":"ba751e571165d321957db0bbe94ca3f2e5f473370ae72a84257ecb497533f74a"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.808627 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" event={"ID":"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec","Type":"ContainerStarted","Data":"5672dda1d5c9b160a4b7b486a54671b0648f1faefc7e70e38fa9fc14f2d3dc47"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.810983 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerStarted","Data":"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.811070 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerStarted","Data":"ea03e21c825372e4f508e4183f07bab9440aa36d8af7963578ed0bad5bcf3f8f"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.811814 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.813544 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerStarted","Data":"3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.815116 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.815511 4755 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pz64x container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.815572 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.816141 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" event={"ID":"39556510-7df7-4b2f-94d6-75b649060c22","Type":"ContainerStarted","Data":"45890e5f4598060274cae5d6900124de864ae12693826b146db8417bf5528c14"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.816205 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" event={"ID":"39556510-7df7-4b2f-94d6-75b649060c22","Type":"ContainerStarted","Data":"75d986ee8bd4078a821be23754658900dbf0ee509d3df8503a35488b3943426a"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.816228 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" event={"ID":"39556510-7df7-4b2f-94d6-75b649060c22","Type":"ContainerStarted","Data":"c1799172fcbf1d7e30ac93fa777d3291701003b3efce3a764bd729a9bf9db293"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.818399 4755 generic.go:334] "Generic (PLEG): container finished" podID="45c99095-eab0-49c4-8ded-fc5359b43ef2" containerID="878f489e5b623b310bd55ab79b117a4c08d12b9f37999d42abdfb061f8c34b5d" exitCode=0 Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.818503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" event={"ID":"45c99095-eab0-49c4-8ded-fc5359b43ef2","Type":"ContainerDied","Data":"878f489e5b623b310bd55ab79b117a4c08d12b9f37999d42abdfb061f8c34b5d"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.818702 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" 
event={"ID":"45c99095-eab0-49c4-8ded-fc5359b43ef2","Type":"ContainerStarted","Data":"098c7922b1ce945e2390d6d1d82aeb2786b9b7ba4510d95d38765df68187a963"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.821078 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerStarted","Data":"5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.823646 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" event={"ID":"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d","Type":"ContainerStarted","Data":"ff675657f16815ecc86e4757b1af8ea00efd122872cb67c077719f8d26daf593"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.823709 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" event={"ID":"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d","Type":"ContainerStarted","Data":"39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.836067 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.853864 4755 request.go:700] Waited for 1.018642805s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.877368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msjhl\" (UniqueName: \"kubernetes.io/projected/7e3cedcb-923f-4cf5-b344-dd3842309d39-kube-api-access-msjhl\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: 
\"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.894463 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.916543 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjzz\" (UniqueName: \"kubernetes.io/projected/9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8-kube-api-access-ltjzz\") pod \"downloads-7954f5f757-l4v7x\" (UID: \"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8\") " pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.931439 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4qp2\" (UniqueName: \"kubernetes.io/projected/efeb6afa-e175-4bad-a0bb-5ace61619959-kube-api-access-p4qp2\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.954406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhcpn\" (UniqueName: \"kubernetes.io/projected/1fdd6691-9136-43ba-abea-7ba6862e9681-kube-api-access-mhcpn\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.969672 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq5qt\" (UniqueName: 
\"kubernetes.io/projected/5f67c724-386b-4736-ace1-73430edd3558-kube-api-access-bq5qt\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.976140 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.987970 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.994393 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.996973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.006899 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.016378 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.018574 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.024705 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.036505 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.044464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.046116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-profile-collector-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.056973 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.058983 4755 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.059089 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert podName:673ae012-3e48-4408-8a01-a67833cabd26 nodeName:}" failed. No retries permitted until 2026-03-20 13:34:14.559063942 +0000 UTC m=+234.156996471 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert") pod "catalog-operator-68c6474976-h4jf2" (UID: "673ae012-3e48-4408-8a01-a67833cabd26") : failed to sync secret cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.059272 4755 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.059941 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume podName:4b90540c-9ef1-478a-a7a1-48817d0c63d0 nodeName:}" failed. No retries permitted until 2026-03-20 13:34:14.559930348 +0000 UTC m=+234.157862877 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume") pod "collect-profiles-29566890-k7h8h" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.077007 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.095769 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.118743 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.139638 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.156553 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.175391 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.198148 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.216554 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.221197 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.236529 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.257133 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.264221 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.276794 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.289738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"] Mar 20 13:34:14 crc 
kubenswrapper[4755]: I0320 13:34:14.297156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.299117 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"] Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.314714 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefeb6afa_e175_4bad_a0bb_5ace61619959.slice/crio-68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640 WatchSource:0}: Error finding container 68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640: Status 404 returned error can't find the container with id 68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640 Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.317251 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fdd6691_9136_43ba_abea_7ba6862e9681.slice/crio-ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89 WatchSource:0}: Error finding container ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89: Status 404 returned error can't find the container with id ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.317291 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.335568 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.356604 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 
13:34:14.365943 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-l4v7x"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.376984 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.393897 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8d45e6_cf9d_4f6f_b459_efe220bbf6d8.slice/crio-d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d WatchSource:0}: Error finding container d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d: Status 404 returned error can't find the container with id d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.395713 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.416740 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.437090 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.447215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-64g8c"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.458129 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.460181 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f67c724_386b_4736_ace1_73430edd3558.slice/crio-6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753 WatchSource:0}: Error finding container 6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753: Status 404 returned error can't find the container with id 6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.476426 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.497182 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.516330 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.537637 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.556181 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.569988 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.575456 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.585216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.585320 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.586243 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.593033 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.597133 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.616106 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.637581 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.655091 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54506: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.666729 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.696992 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.715059 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.745013 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.750798 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54522: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.754737 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.776889 4755 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.793073 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54528: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.795398 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.820940 4755 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.831286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" event={"ID":"5f67c724-386b-4736-ace1-73430edd3558","Type":"ContainerStarted","Data":"88aa621b295c782e88050d005ee25a67f87afd469bce1ac8a057660b2326fafa"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.831354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" event={"ID":"5f67c724-386b-4736-ace1-73430edd3558","Type":"ContainerStarted","Data":"6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.832914 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" event={"ID":"7e3cedcb-923f-4cf5-b344-dd3842309d39","Type":"ContainerStarted","Data":"96510e9c3614e05b5cac667c321981d56cc16b49515f280c0404f0f02d7e0f61"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.832939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" event={"ID":"7e3cedcb-923f-4cf5-b344-dd3842309d39","Type":"ContainerStarted","Data":"d67a003c3dab3abc4dedddd843659ac5a0416d89137a7d6ebf9505449b28b2c2"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.834677 4755 generic.go:334] "Generic (PLEG): container finished" podID="b41fdebf-1886-4b30-b583-368242316562" containerID="a87ca0fc084eebae1508d13df991c9fdf881a46c04c1ca309ae21ad137c9ac71" exitCode=0 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.834724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" 
event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerDied","Data":"a87ca0fc084eebae1508d13df991c9fdf881a46c04c1ca309ae21ad137c9ac71"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.838095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" event={"ID":"1fdd6691-9136-43ba-abea-7ba6862e9681","Type":"ContainerStarted","Data":"0c703fd3704b0dc3498707636368a0438967cba95c29e00b3d89f6b05a93ae25"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.838197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" event={"ID":"1fdd6691-9136-43ba-abea-7ba6862e9681","Type":"ContainerStarted","Data":"9f31d7a336050576f16fb2c470c156191647fe965629887bacb3403c6fce0422"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.838211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" event={"ID":"1fdd6691-9136-43ba-abea-7ba6862e9681","Type":"ContainerStarted","Data":"ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.840388 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.842337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerStarted","Data":"275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.842368 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerStarted","Data":"9edc35520733cdbb8ffbbdcc2f02ec6ef4e5e7ada3cc88f2fa7d388e53bb80dd"} Mar 20 
13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.843046 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.844995 4755 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wpj5p container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.845063 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.846644 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-l4v7x" event={"ID":"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8","Type":"ContainerStarted","Data":"915eb530da816e9e17072047fc4a554dcb3ad3d7ada42dbbd3ff4edffb783862"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.846722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-l4v7x" event={"ID":"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8","Type":"ContainerStarted","Data":"d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.847783 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.848944 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.848986 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.850955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" event={"ID":"45c99095-eab0-49c4-8ded-fc5359b43ef2","Type":"ContainerStarted","Data":"5f61f4cc6b6288ecfe2b511eff06c1b8d1e7228a39df7183bd2343f89dcfafec"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.857098 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54544: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.858004 4755 request.go:700] Waited for 1.912358624s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.861185 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerStarted","Data":"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.861353 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.861835 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.864435 4755 generic.go:334] "Generic (PLEG): container finished" podID="efeb6afa-e175-4bad-a0bb-5ace61619959" containerID="5ff09c4938c795e7be70b5bea1bdf0083d2e9b549cb47503854dc79b5ee1c65a" exitCode=0 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.864959 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" event={"ID":"efeb6afa-e175-4bad-a0bb-5ace61619959","Type":"ContainerDied","Data":"5ff09c4938c795e7be70b5bea1bdf0083d2e9b549cb47503854dc79b5ee1c65a"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.864987 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" event={"ID":"efeb6afa-e175-4bad-a0bb-5ace61619959","Type":"ContainerStarted","Data":"68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.875042 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.880345 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.931187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-kube-api-access-v2l2c\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.935841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/862bcbad-0c15-4e2d-b205-83ab3721cd9a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.938060 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.963254 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.975290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.995280 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.996086 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54552: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.002127 4755 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.025228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p4pm\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-kube-api-access-6p4pm\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.046323 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpd9k\" (UniqueName: \"kubernetes.io/projected/8ff5ba16-93f9-4313-a857-23a1c87c1cac-kube-api-access-lpd9k\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.062928 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwwp\" (UniqueName: \"kubernetes.io/projected/c452cf60-67a1-434f-b2da-7e81992e28a6-kube-api-access-qnwwp\") pod \"migrator-59844c95c7-ggscl\" (UID: \"c452cf60-67a1-434f-b2da-7e81992e28a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.083063 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54566: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.084056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:15 crc 
kubenswrapper[4755]: I0320 13:34:15.107571 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.133696 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86bzg\" (UniqueName: \"kubernetes.io/projected/673ae012-3e48-4408-8a01-a67833cabd26-kube-api-access-86bzg\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.137675 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.141624 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b27e760-b22d-415a-93cc-866c2471ee63-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.154600 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.181282 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54580: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.186424 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.186488 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.200037 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.219337 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.256715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.269812 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.278119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.296230 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306586 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-config\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306643 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8774c8e-1dd9-481c-9091-85a2fe704069-proxy-tls\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306676 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbts8\" (UniqueName: \"kubernetes.io/projected/eb9a014d-9a58-4461-adc6-2ee3981782a3-kube-api-access-tbts8\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-certs\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306712 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln85d\" (UniqueName: \"kubernetes.io/projected/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-kube-api-access-ln85d\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-metrics-certs\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-service-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306797 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpm2r\" (UniqueName: \"kubernetes.io/projected/85fb2982-9af0-4450-80f4-12fbd6e7a590-kube-api-access-zpm2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-default-certificate\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306848 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-stats-auth\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306865 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvxg\" (UniqueName: \"kubernetes.io/projected/14060dd8-a97a-404b-9020-9f9e519e78d9-kube-api-access-cvvxg\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306886 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"auto-csr-approver-29566894-tzlc5\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") " pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-tmpfs\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306982 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307001 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"marketplace-operator-79b997595-229g6\" (UID: 
\"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307020 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb2982-9af0-4450-80f4-12fbd6e7a590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307037 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f788c2-2578-4d14-9c8f-115f15a5a817-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhqb\" (UniqueName: \"kubernetes.io/projected/e8774c8e-1dd9-481c-9091-85a2fe704069-kube-api-access-lxhqb\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307080 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-config\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 
13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307598 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"] Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309150 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-serving-cert\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309533 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9wh\" (UniqueName: \"kubernetes.io/projected/c9f788c2-2578-4d14-9c8f-115f15a5a817-kube-api-access-5c9wh\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-node-bootstrap-token\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc 
kubenswrapper[4755]: I0320 13:34:15.309811 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lprp\" (UniqueName: \"kubernetes.io/projected/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-kube-api-access-2lprp\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310147 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-key\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310169 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl97m\" (UniqueName: \"kubernetes.io/projected/5f249077-e650-4ad5-b008-7af17910535a-kube-api-access-dl97m\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-serving-cert\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzt4h\" (UniqueName: 
\"kubernetes.io/projected/8b65e162-155e-4d40-ab1a-e3560b29f19f-kube-api-access-mzt4h\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-cabundle\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310357 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8def433-c490-4469-9e43-12ba06428091-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.311859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-auth-proxy-config\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.311905 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzcxx\" (UniqueName: \"kubernetes.io/projected/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-kube-api-access-gzcxx\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.311931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-webhook-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.312124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-apiservice-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.312522 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/903404df-f7c6-46d5-9227-748ecc920ac3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.312711 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lj9f\" (UniqueName: \"kubernetes.io/projected/0e40521a-c254-4fd5-99e8-1296dd288e2d-kube-api-access-4lj9f\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313178 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-client\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14060dd8-a97a-404b-9020-9f9e519e78d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8def433-c490-4469-9e43-12ba06428091-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313595 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod 
\"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14060dd8-a97a-404b-9020-9f9e519e78d9-proxy-tls\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313774 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257ws\" (UniqueName: \"kubernetes.io/projected/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-kube-api-access-257ws\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") pod \"auto-csr-approver-29566892-xh9lg\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") " pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.314064 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9wx2\" (UniqueName: \"kubernetes.io/projected/903404df-f7c6-46d5-9227-748ecc920ac3-kube-api-access-t9wx2\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.314086 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f788c2-2578-4d14-9c8f-115f15a5a817-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.314109 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.315155 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:15.815142081 +0000 UTC m=+235.413074610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.315759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b65e162-155e-4d40-ab1a-e3560b29f19f-service-ca-bundle\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.315810 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-srv-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.315864 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316130 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9x8r\" (UniqueName: 
\"kubernetes.io/projected/d8def433-c490-4469-9e43-12ba06428091-kube-api-access-x9x8r\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316328 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316351 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"image-registry-697d97f7c8-bckdl\" (UID: 
\"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316484 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-images\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316821 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-config\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.361539 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54582: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.385056 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.394992 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-l4v7x" podStartSLOduration=163.394967753 podStartE2EDuration="2m43.394967753s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:15.33677151 +0000 UTC m=+234.934704049" watchObservedRunningTime="2026-03-20 13:34:15.394967753 +0000 UTC m=+234.992900282" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.418857 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419436 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl97m\" (UniqueName: \"kubernetes.io/projected/5f249077-e650-4ad5-b008-7af17910535a-kube-api-access-dl97m\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419472 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-key\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419508 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-serving-cert\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af19a889-4a85-42c6-aafa-6714754c5a86-config-volume\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzt4h\" (UniqueName: \"kubernetes.io/projected/8b65e162-155e-4d40-ab1a-e3560b29f19f-kube-api-access-mzt4h\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419592 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8def433-c490-4469-9e43-12ba06428091-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419621 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-cabundle\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 
13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419685 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-auth-proxy-config\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzcxx\" (UniqueName: \"kubernetes.io/projected/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-kube-api-access-gzcxx\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-webhook-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419765 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-apiservice-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/903404df-f7c6-46d5-9227-748ecc920ac3-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lj9f\" (UniqueName: \"kubernetes.io/projected/0e40521a-c254-4fd5-99e8-1296dd288e2d-kube-api-access-4lj9f\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af19a889-4a85-42c6-aafa-6714754c5a86-metrics-tls\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6px\" (UniqueName: \"kubernetes.io/projected/da3395eb-3396-4bbc-8a18-3d57519c4667-kube-api-access-sm6px\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-csi-data-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.449001 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:15.948964298 +0000 UTC m=+235.546896827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456252 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-client\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456378 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51b44b44-8a09-430a-ba3c-92e2c2f916f6-cert\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456424 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14060dd8-a97a-404b-9020-9f9e519e78d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc 
kubenswrapper[4755]: I0320 13:34:15.456463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8def433-c490-4469-9e43-12ba06428091-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14060dd8-a97a-404b-9020-9f9e519e78d9-proxy-tls\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456588 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257ws\" (UniqueName: \"kubernetes.io/projected/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-kube-api-access-257ws\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456675 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") pod \"auto-csr-approver-29566892-xh9lg\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") " pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9wx2\" (UniqueName: \"kubernetes.io/projected/903404df-f7c6-46d5-9227-748ecc920ac3-kube-api-access-t9wx2\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f788c2-2578-4d14-9c8f-115f15a5a817-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 
crc kubenswrapper[4755]: I0320 13:34:15.456755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b65e162-155e-4d40-ab1a-e3560b29f19f-service-ca-bundle\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-srv-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9x8r\" (UniqueName: \"kubernetes.io/projected/d8def433-c490-4469-9e43-12ba06428091-kube-api-access-x9x8r\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: 
\"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456899 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456921 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457022 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-images\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457058 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-config\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457082 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-config\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457178 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmvv\" (UniqueName: \"kubernetes.io/projected/51b44b44-8a09-430a-ba3c-92e2c2f916f6-kube-api-access-ccmvv\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457197 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-certs\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457237 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln85d\" (UniqueName: \"kubernetes.io/projected/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-kube-api-access-ln85d\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8774c8e-1dd9-481c-9091-85a2fe704069-proxy-tls\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbts8\" (UniqueName: \"kubernetes.io/projected/eb9a014d-9a58-4461-adc6-2ee3981782a3-kube-api-access-tbts8\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457294 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-service-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457323 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-metrics-certs\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpm2r\" (UniqueName: \"kubernetes.io/projected/85fb2982-9af0-4450-80f4-12fbd6e7a590-kube-api-access-zpm2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-registration-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457392 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-default-certificate\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457408 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-stats-auth\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvxg\" (UniqueName: \"kubernetes.io/projected/14060dd8-a97a-404b-9020-9f9e519e78d9-kube-api-access-cvvxg\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457447 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 
13:34:15.457478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"auto-csr-approver-29566894-tzlc5\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") " pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457497 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-tmpfs\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f788c2-2578-4d14-9c8f-115f15a5a817-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"marketplace-operator-79b997595-229g6\" (UID: 
\"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457630 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb2982-9af0-4450-80f4-12fbd6e7a590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457664 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-plugins-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhqb\" (UniqueName: \"kubernetes.io/projected/e8774c8e-1dd9-481c-9091-85a2fe704069-kube-api-access-lxhqb\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457713 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-mountpoint-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457743 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-config\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457776 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-serving-cert\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457846 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-socket-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457941 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbk6\" (UniqueName: \"kubernetes.io/projected/af19a889-4a85-42c6-aafa-6714754c5a86-kube-api-access-4qbk6\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457962 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5c9wh\" (UniqueName: \"kubernetes.io/projected/c9f788c2-2578-4d14-9c8f-115f15a5a817-kube-api-access-5c9wh\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457993 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-node-bootstrap-token\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.458021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lprp\" (UniqueName: \"kubernetes.io/projected/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-kube-api-access-2lprp\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.459563 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14060dd8-a97a-404b-9020-9f9e519e78d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.474603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-key\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: 
\"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.476669 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8def433-c490-4469-9e43-12ba06428091-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.477075 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8def433-c490-4469-9e43-12ba06428091-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.477382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-service-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.478366 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-cabundle\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.480965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-auth-proxy-config\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.482020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-tmpfs\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.484162 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-serving-cert\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.492435 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-stats-auth\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.492553 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-metrics-certs\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493350 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-default-certificate\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493738 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-config\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493843 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.494741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.495131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb2982-9af0-4450-80f4-12fbd6e7a590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.495303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.495513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b65e162-155e-4d40-ab1a-e3560b29f19f-service-ca-bundle\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.496427 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:15.996409414 +0000 UTC m=+235.594341943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.497074 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-webhook-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.498543 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.499245 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/903404df-f7c6-46d5-9227-748ecc920ac3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.499451 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8774c8e-1dd9-481c-9091-85a2fe704069-proxy-tls\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: 
\"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.499868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-images\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.500191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.500227 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-config\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.506355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f788c2-2578-4d14-9c8f-115f15a5a817-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.507951 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-config\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.510931 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.515450 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.518855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-client\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.523158 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14060dd8-a97a-404b-9020-9f9e519e78d9-proxy-tls\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.526629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbts8\" (UniqueName: \"kubernetes.io/projected/eb9a014d-9a58-4461-adc6-2ee3981782a3-kube-api-access-tbts8\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: 
\"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.526698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.528136 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.528548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-serving-cert\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.530096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl97m\" (UniqueName: \"kubernetes.io/projected/5f249077-e650-4ad5-b008-7af17910535a-kube-api-access-dl97m\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.530951 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"certs\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-certs\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.531455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.531584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-srv-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.532131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-apiservice-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.532501 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9wh\" (UniqueName: \"kubernetes.io/projected/c9f788c2-2578-4d14-9c8f-115f15a5a817-kube-api-access-5c9wh\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.532844 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-node-bootstrap-token\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.533689 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzt4h\" (UniqueName: \"kubernetes.io/projected/8b65e162-155e-4d40-ab1a-e3560b29f19f-kube-api-access-mzt4h\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.534121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f788c2-2578-4d14-9c8f-115f15a5a817-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.539832 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhqb\" (UniqueName: \"kubernetes.io/projected/e8774c8e-1dd9-481c-9091-85a2fe704069-kube-api-access-lxhqb\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.540468 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bckdl\" (UID: 
\"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.559587 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.059557457 +0000 UTC m=+235.657489986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af19a889-4a85-42c6-aafa-6714754c5a86-config-volume\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af19a889-4a85-42c6-aafa-6714754c5a86-metrics-tls\") pod \"dns-default-2rw7x\" (UID: 
\"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559806 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6px\" (UniqueName: \"kubernetes.io/projected/da3395eb-3396-4bbc-8a18-3d57519c4667-kube-api-access-sm6px\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559828 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-csi-data-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51b44b44-8a09-430a-ba3c-92e2c2f916f6-cert\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559875 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559950 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmvv\" (UniqueName: \"kubernetes.io/projected/51b44b44-8a09-430a-ba3c-92e2c2f916f6-kube-api-access-ccmvv\") pod \"ingress-canary-r96g9\" (UID: 
\"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-registration-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-plugins-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-mountpoint-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560052 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-socket-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbk6\" (UniqueName: \"kubernetes.io/projected/af19a889-4a85-42c6-aafa-6714754c5a86-kube-api-access-4qbk6\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") 
" pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.561001 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af19a889-4a85-42c6-aafa-6714754c5a86-config-volume\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.563100 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-csi-data-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.564173 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.064125016 +0000 UTC m=+235.662057625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564639 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-plugins-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564730 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-registration-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-mountpoint-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564838 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-socket-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 
crc kubenswrapper[4755]: I0320 13:34:15.571717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51b44b44-8a09-430a-ba3c-92e2c2f916f6-cert\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.572563 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af19a889-4a85-42c6-aafa-6714754c5a86-metrics-tls\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.573352 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lprp\" (UniqueName: \"kubernetes.io/projected/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-kube-api-access-2lprp\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.616143 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"auto-csr-approver-29566894-tzlc5\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") " pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.622083 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.628087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzcxx\" (UniqueName: \"kubernetes.io/projected/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-kube-api-access-gzcxx\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.631542 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.647825 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpm2r\" (UniqueName: \"kubernetes.io/projected/85fb2982-9af0-4450-80f4-12fbd6e7a590-kube-api-access-zpm2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.648001 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvvxg\" (UniqueName: \"kubernetes.io/projected/14060dd8-a97a-404b-9020-9f9e519e78d9-kube-api-access-cvvxg\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.655227 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.662187 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.663012 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.162985578 +0000 UTC m=+235.760918107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.673518 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.678248 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.685577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.699994 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.719369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.723142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln85d\" (UniqueName: \"kubernetes.io/projected/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-kube-api-access-ln85d\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.726043 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9x8r\" (UniqueName: \"kubernetes.io/projected/d8def433-c490-4469-9e43-12ba06428091-kube-api-access-x9x8r\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.768785 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.769175 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.269160143 +0000 UTC m=+235.867092672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.769547 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.779890 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"] Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.793970 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") 
pod \"auto-csr-approver-29566892-xh9lg\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") " pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.804116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257ws\" (UniqueName: \"kubernetes.io/projected/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-kube-api-access-257ws\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.824584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.825520 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9wx2\" (UniqueName: \"kubernetes.io/projected/903404df-f7c6-46d5-9227-748ecc920ac3-kube-api-access-t9wx2\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.831341 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.842346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.851090 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.858012 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.875803 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.876175 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.877758 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.377731771 +0000 UTC m=+235.975664300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.878508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lj9f\" (UniqueName: \"kubernetes.io/projected/0e40521a-c254-4fd5-99e8-1296dd288e2d-kube-api-access-4lj9f\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.885116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbk6\" (UniqueName: \"kubernetes.io/projected/af19a889-4a85-42c6-aafa-6714754c5a86-kube-api-access-4qbk6\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: W0320 13:34:15.885756 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b65e162_155e_4d40_ab1a_e3560b29f19f.slice/crio-8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b WatchSource:0}: Error finding container 8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b: Status 404 returned error can't find the container with id 8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.894499 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.898400 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmvv\" (UniqueName: \"kubernetes.io/projected/51b44b44-8a09-430a-ba3c-92e2c2f916f6-kube-api-access-ccmvv\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.904354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerStarted","Data":"d9510fbd56632079c0f63151bc7c72dd4b0c1b7c506093667a48f1b183b8afe1"} Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.907348 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.913900 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.923523 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.939735 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.940831 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.951428 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6px\" (UniqueName: \"kubernetes.io/projected/da3395eb-3396-4bbc-8a18-3d57519c4667-kube-api-access-sm6px\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.981035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.981765 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.481725729 +0000 UTC m=+236.079658258 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.992047 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerStarted","Data":"a883711492469aba5080025f39ee56d456d68c7d62a0b2da2289bad36e4ed8ea"} Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.005578 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.017035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" event={"ID":"862bcbad-0c15-4e2d-b205-83ab3721cd9a","Type":"ContainerStarted","Data":"039f9c39c017712ba47cb122b137bf255d2e7b389297efbc6e37c0a64c24630c"} Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.039946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" event={"ID":"7e3cedcb-923f-4cf5-b344-dd3842309d39","Type":"ContainerStarted","Data":"1caf15d648ab356ba8ce29ae6656d1d61a4848069f486150e3c30cb8afffaf10"} Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.052481 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.068008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" event={"ID":"5f67c724-386b-4736-ace1-73430edd3558","Type":"ContainerStarted","Data":"136c9719504ddddb3a72818721408e17db6a72d62ab8121d2be5854433e1e5af"} Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.078489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.082322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.086951 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54598: no serving certificate available for the kubelet" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.088227 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" event={"ID":"efeb6afa-e175-4bad-a0bb-5ace61619959","Type":"ContainerStarted","Data":"c4cff8e8d91f9f04f81c77d0489fb00ab6a37272797923c9c73d89dd5bbdff5d"} Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.091646 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.091781 4755 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.093198 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.100084 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" podStartSLOduration=164.100053264 podStartE2EDuration="2m44.100053264s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.094563577 +0000 UTC m=+235.692496106" watchObservedRunningTime="2026-03-20 13:34:16.100053264 +0000 UTC m=+235.697985793" Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.108114 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.608066928 +0000 UTC m=+236.205999457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.189694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.196392 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.696371239 +0000 UTC m=+236.294303768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.293081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.293636 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.793617471 +0000 UTC m=+236.391550000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.313122 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" podStartSLOduration=164.313101325 podStartE2EDuration="2m44.313101325s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.309609619 +0000 UTC m=+235.907542148" watchObservedRunningTime="2026-03-20 13:34:16.313101325 +0000 UTC m=+235.911033854" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.341171 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.376586 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podStartSLOduration=164.376566189 podStartE2EDuration="2m44.376566189s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.355761845 +0000 UTC m=+235.953694374" watchObservedRunningTime="2026-03-20 13:34:16.376566189 +0000 UTC m=+235.974498718" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.395226 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.395583 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.895572288 +0000 UTC m=+236.493504817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.414505 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.426368 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zgr4"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.490324 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" podStartSLOduration=164.490302674 podStartE2EDuration="2m44.490302674s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.471505901 +0000 UTC m=+236.069438420" 
watchObservedRunningTime="2026-03-20 13:34:16.490302674 +0000 UTC m=+236.088235203" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.497037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.497708 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.997690409 +0000 UTC m=+236.595622938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.531447 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.550772 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.553566 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" podStartSLOduration=164.55354181 
podStartE2EDuration="2m44.55354181s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.541841184 +0000 UTC m=+236.139773713" watchObservedRunningTime="2026-03-20 13:34:16.55354181 +0000 UTC m=+236.151474339" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.558830 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.570678 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.600123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.600718 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.100703838 +0000 UTC m=+236.698636357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: W0320 13:34:16.637863 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d33b23_44ac_48b5_8981_fe9a764b1bee.slice/crio-3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46 WatchSource:0}: Error finding container 3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46: Status 404 returned error can't find the container with id 3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46 Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.711426 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.711759 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.211732391 +0000 UTC m=+236.809664920 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.712301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.712698 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.212680959 +0000 UTC m=+236.810613488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: W0320 13:34:16.761847 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b90540c_9ef1_478a_a7a1_48817d0c63d0.slice/crio-d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65 WatchSource:0}: Error finding container d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65: Status 404 returned error can't find the container with id d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65 Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.815948 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.816662 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.316631816 +0000 UTC m=+236.914564345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.833227 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" podStartSLOduration=163.833200201 podStartE2EDuration="2m43.833200201s" podCreationTimestamp="2026-03-20 13:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.808834678 +0000 UTC m=+236.406767197" watchObservedRunningTime="2026-03-20 13:34:16.833200201 +0000 UTC m=+236.431132730" Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.836892 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.910370 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2"] Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.918884 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.919319 4755 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.419304855 +0000 UTC m=+237.017237384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.023444 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.036337 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86"] Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.026269 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.526245682 +0000 UTC m=+237.124178221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.026153 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.037043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.037732 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.537712192 +0000 UTC m=+237.135644721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.058234 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.077639 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" podStartSLOduration=165.077611207 podStartE2EDuration="2m45.077611207s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.075773752 +0000 UTC m=+236.673706291" watchObservedRunningTime="2026-03-20 13:34:17.077611207 +0000 UTC m=+236.675543736" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.110326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.168778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.171399 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.171511 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.671486108 +0000 UTC m=+237.269418647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.171935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.172294 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.672268621 +0000 UTC m=+237.270201150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.181934 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.190506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" event={"ID":"8b27e760-b22d-415a-93cc-866c2471ee63","Type":"ContainerStarted","Data":"52e70f2c08587f234542371eb63df44cff34c53fc242963b2b61f887897d2d3a"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.214163 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerStarted","Data":"7fdfa8cc5c6e3a309f83f429b136914c6b11eb49f4f2ee209076e8791b5023e5"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.235703 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerStarted","Data":"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.256345 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" event={"ID":"04d33b23-44ac-48b5-8981-fe9a764b1bee","Type":"ContainerStarted","Data":"3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46"} 
Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.269449 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" event={"ID":"1b9cfbce-3f17-4155-a022-243e6d220bf8","Type":"ContainerStarted","Data":"4b461d0707325d4a02f122a17e58f9e5a84ef93592d659cd6f0a39fa66be7fd6"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.272814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.274248 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.774224658 +0000 UTC m=+237.372157177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.288199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" event={"ID":"673ae012-3e48-4408-8a01-a67833cabd26","Type":"ContainerStarted","Data":"4836923cd199d8b812b8adba92d36b4353a73b0cb19ba9b3e51b5138b611783f"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.305703 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" event={"ID":"c452cf60-67a1-434f-b2da-7e81992e28a6","Type":"ContainerStarted","Data":"2ee3f579f873566cac82cfed106e8f9f2d43da1d9062ee0e8d7200af628eac14"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.309239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" event={"ID":"862bcbad-0c15-4e2d-b205-83ab3721cd9a","Type":"ContainerStarted","Data":"ba74282b9aff06f76885c33bcf2503d8ea4ba5de4cc982b7c5f67a676592536e"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.360367 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r48mq" event={"ID":"8b65e162-155e-4d40-ab1a-e3560b29f19f","Type":"ContainerStarted","Data":"8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.367937 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:34:17 crc 
kubenswrapper[4755]: I0320 13:34:17.377426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.379461 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.879433293 +0000 UTC m=+237.477365902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.380002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" event={"ID":"e8774c8e-1dd9-481c-9091-85a2fe704069","Type":"ContainerStarted","Data":"9a0dc9512c594f8b5847c36bf313d8eebf2ece5ed2e39f0f3859a4f9db6478b4"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.381707 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2b4nn" event={"ID":"5f249077-e650-4ad5-b008-7af17910535a","Type":"ContainerStarted","Data":"13eb28fc506d56eb2a8ad7e822339124caa3dc1366c6165033d1f585862c3d02"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 
13:34:17.383081 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" podStartSLOduration=165.383068973 podStartE2EDuration="2m45.383068973s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.37309634 +0000 UTC m=+236.971028869" watchObservedRunningTime="2026-03-20 13:34:17.383068973 +0000 UTC m=+236.981001502" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.411077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" event={"ID":"c9f788c2-2578-4d14-9c8f-115f15a5a817","Type":"ContainerStarted","Data":"73959380ea3e1f8bd230283269d87862536d87ab6399359926234c24e12e021d"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.431549 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54288: no serving certificate available for the kubelet" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.441449 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerStarted","Data":"d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.457364 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" event={"ID":"8ff5ba16-93f9-4313-a857-23a1c87c1cac","Type":"ContainerStarted","Data":"77353dcb09a108108eb7a0aefc5e69dc9220ee5f614fd242f9b082fbca5c1150"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.459379 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.459423 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.493010 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.499675 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.999628915 +0000 UTC m=+237.597561444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.573098 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.607141 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.608479 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.108459301 +0000 UTC m=+237.706391830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.709803 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" podStartSLOduration=165.709781448 podStartE2EDuration="2m45.709781448s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.709012445 +0000 UTC m=+237.306944974" watchObservedRunningTime="2026-03-20 13:34:17.709781448 +0000 UTC m=+237.307713977" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.710265 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.210238501 +0000 UTC m=+237.808171020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.710170 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.711025 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.711422 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.211413868 +0000 UTC m=+237.809346397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.790525 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rb5zn" podStartSLOduration=165.790499857 podStartE2EDuration="2m45.790499857s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.746068863 +0000 UTC m=+237.344001392" watchObservedRunningTime="2026-03-20 13:34:17.790499857 +0000 UTC m=+237.388432386" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.812316 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.813129 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.313092725 +0000 UTC m=+237.911025284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.835111 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" podStartSLOduration=165.835086676 podStartE2EDuration="2m45.835086676s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.791075634 +0000 UTC m=+237.389008173" watchObservedRunningTime="2026-03-20 13:34:17.835086676 +0000 UTC m=+237.433019205" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.880651 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.883221 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.897685 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" podStartSLOduration=165.897645511 podStartE2EDuration="2m45.897645511s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.897215378 +0000 UTC m=+237.495147907" watchObservedRunningTime="2026-03-20 13:34:17.897645511 
+0000 UTC m=+237.495578040" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.920071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.921708 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.923153 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.423133678 +0000 UTC m=+238.021066207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.982468 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" podStartSLOduration=165.982445845 podStartE2EDuration="2m45.982445845s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.980051131 +0000 UTC m=+237.577983660" watchObservedRunningTime="2026-03-20 13:34:17.982445845 +0000 UTC m=+237.580378374" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.990154 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.990493 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.022772 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.023537 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.523519966 +0000 UTC m=+238.121452495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.069929 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" podStartSLOduration=166.06990617 podStartE2EDuration="2m46.06990617s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.023871657 +0000 UTC m=+237.621804186" watchObservedRunningTime="2026-03-20 13:34:18.06990617 +0000 UTC m=+237.667838699" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.110070 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.124801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:18 crc 
kubenswrapper[4755]: E0320 13:34:18.125852 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.625832624 +0000 UTC m=+238.223765153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.141495 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgkhb"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.212468 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.225940 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.226601 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.726586783 +0000 UTC m=+238.324519312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.333536 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.334467 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.834446569 +0000 UTC m=+238.432379098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.445885 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r96g9"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.446251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.446679 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.946646088 +0000 UTC m=+238.544578617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.491179 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hpj2j"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.512984 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2b4nn" event={"ID":"5f249077-e650-4ad5-b008-7af17910535a","Type":"ContainerStarted","Data":"c150f0bbdb95baf7383c53f6a3f6a014b41ffd0c8068ffbdf9a0de37fc2d3222"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.518364 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r48mq" event={"ID":"8b65e162-155e-4d40-ab1a-e3560b29f19f","Type":"ContainerStarted","Data":"7ed4d9d1329bb8b20ecfe08f82bd47710ddeaf41ded2652e97255ca0e97a27d0"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.549129 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.551616 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:19.051596656 +0000 UTC m=+238.649529185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.564292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerStarted","Data":"87e583e770b84390d24444ba39d071b7a79cf80b1ef8556c747221568f1b50de"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.591843 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.591939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" event={"ID":"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9","Type":"ContainerStarted","Data":"2a9190782406b02779e67dc2a7bd5c76d522825551f1d3835561e955bd4878a3"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.610222 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerStarted","Data":"d1a591dd18b3c1bd59ffa816236e003ecd9f7f13017f4edb2ba58c108b15d7f4"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.617044 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" 
event={"ID":"1b9cfbce-3f17-4155-a022-243e6d220bf8","Type":"ContainerStarted","Data":"fb51a73fc7c80806a2832de9fd14858918cffb026fa6c342f519a7b50a828454"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.634188 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2b4nn" podStartSLOduration=6.6341639610000005 podStartE2EDuration="6.634163961s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.555214615 +0000 UTC m=+238.153147144" watchObservedRunningTime="2026-03-20 13:34:18.634163961 +0000 UTC m=+238.232096480" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.647571 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.647689 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.647722 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:18 crc kubenswrapper[4755]: W0320 13:34:18.658293 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf19a889_4a85_42c6_aafa_6714754c5a86.slice/crio-45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e WatchSource:0}: Error finding container 
45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e: Status 404 returned error can't find the container with id 45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.677808 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.681053 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.688681 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2rw7x"] Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.694220 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.194159508 +0000 UTC m=+238.792092037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.706220 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qkxhv"] Mar 20 13:34:18 crc kubenswrapper[4755]: W0320 13:34:18.712375 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56ded58_9184_4d39_b422_9ea9e8f6b9ea.slice/crio-73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11 WatchSource:0}: Error finding container 73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11: Status 404 returned error can't find the container with id 73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11 Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.759062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" event={"ID":"04d33b23-44ac-48b5-8981-fe9a764b1bee","Type":"ContainerStarted","Data":"c95ecfcf06991a49718f0ea2768a0a1b8bf011adcc8fa84dee5a421e761d426b"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.769109 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.771390 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r48mq" podStartSLOduration=166.771366571 podStartE2EDuration="2m46.771366571s" podCreationTimestamp="2026-03-20 13:31:32 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.606990862 +0000 UTC m=+238.204923391" watchObservedRunningTime="2026-03-20 13:34:18.771366571 +0000 UTC m=+238.369299100" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.777797 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" podStartSLOduration=166.777770056 podStartE2EDuration="2m46.777770056s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.659772401 +0000 UTC m=+238.257704950" watchObservedRunningTime="2026-03-20 13:34:18.777770056 +0000 UTC m=+238.375702585" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.778397 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"] Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.786980 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" event={"ID":"27e50d13-5c93-4dd7-a2c8-7ba505e2f549","Type":"ContainerStarted","Data":"665676f9bda8b22269ecdde850f1c1d577f3cf41c3859caa40359e6bf9f18eae"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.795446 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" event={"ID":"c9f788c2-2578-4d14-9c8f-115f15a5a817","Type":"ContainerStarted","Data":"98b7aec7ba10ef671338e0b35d8f190727c8b67519161949b38364b314942d6d"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.812600 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.813850 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.313832374 +0000 UTC m=+238.911764893 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.828781 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" event={"ID":"eb9a014d-9a58-4461-adc6-2ee3981782a3","Type":"ContainerStarted","Data":"fb76caa1d2db3e98fe4b185f43deb98fbe90661bbf00761b719fc977a3879221"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.828853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" event={"ID":"eb9a014d-9a58-4461-adc6-2ee3981782a3","Type":"ContainerStarted","Data":"af29aee6314a51e0e0f3feacac2582b11ae13af8a02aa2e1e04d014d72c97020"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.829419 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:18 crc 
kubenswrapper[4755]: I0320 13:34:18.832576 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" podStartSLOduration=166.832553035 podStartE2EDuration="2m46.832553035s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.832298187 +0000 UTC m=+238.430230736" watchObservedRunningTime="2026-03-20 13:34:18.832553035 +0000 UTC m=+238.430485564" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.837455 4755 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rtzzb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.837529 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" podUID="eb9a014d-9a58-4461-adc6-2ee3981782a3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.850967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" event={"ID":"85fb2982-9af0-4450-80f4-12fbd6e7a590","Type":"ContainerStarted","Data":"9fb27c6045547886f781e289c220e8d084d39fa06f1d39729ca9db18c3b8c0e1"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.851050 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" 
event={"ID":"85fb2982-9af0-4450-80f4-12fbd6e7a590","Type":"ContainerStarted","Data":"7a561f5bbe91e5fed7457d49d2a9ae216e382d0f9f3884fb58722e37f3431ce9"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.866695 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" podStartSLOduration=166.866641233 podStartE2EDuration="2m46.866641233s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.853033249 +0000 UTC m=+238.450965778" watchObservedRunningTime="2026-03-20 13:34:18.866641233 +0000 UTC m=+238.464573762" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.881959 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" podStartSLOduration=166.881938539 podStartE2EDuration="2m46.881938539s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.877975309 +0000 UTC m=+238.475907828" watchObservedRunningTime="2026-03-20 13:34:18.881938539 +0000 UTC m=+238.479871058" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.887389 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" event={"ID":"8ff5ba16-93f9-4313-a857-23a1c87c1cac","Type":"ContainerStarted","Data":"be70966701d6cda176ee303d4d649e64a8e74c12777e4d4ee09a445be3d6ed5e"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.889605 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.903928 4755 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-6zgr4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.903991 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" podUID="8ff5ba16-93f9-4313-a857-23a1c87c1cac" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.909489 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" event={"ID":"673ae012-3e48-4408-8a01-a67833cabd26","Type":"ContainerStarted","Data":"151e814fabd7eeb84bc7535b6301244d51581b430e1a329c2d08cd114110497b"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.910307 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.914413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.919286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" event={"ID":"903404df-f7c6-46d5-9227-748ecc920ac3","Type":"ContainerStarted","Data":"17bd9204ad3a0d50217b0efa90a5d14971587a5d314e7592eccdade678490148"} Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.919298 
4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.419273487 +0000 UTC m=+239.017206016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.931222 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" podStartSLOduration=166.9312027 podStartE2EDuration="2m46.9312027s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.928110976 +0000 UTC m=+238.526043505" watchObservedRunningTime="2026-03-20 13:34:18.9312027 +0000 UTC m=+238.529135229" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.934521 4755 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-h4jf2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.934580 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" podUID="673ae012-3e48-4408-8a01-a67833cabd26" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.942245 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" event={"ID":"14060dd8-a97a-404b-9020-9f9e519e78d9","Type":"ContainerStarted","Data":"ab6e8363864b1a804ddc99c0ed8403ec23675067240e77300a29abe09652e472"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.942305 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" event={"ID":"14060dd8-a97a-404b-9020-9f9e519e78d9","Type":"ContainerStarted","Data":"8ece1944cd19e7ac1137ed829e08f920fd86a6ab247cd9d747859a2dfa0c9e1c"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.949142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" event={"ID":"c452cf60-67a1-434f-b2da-7e81992e28a6","Type":"ContainerStarted","Data":"8fc491215d42b479f9ba11e0ab94bf7f1a250c67800ed6bd25e629cda38bf227"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.953353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" event={"ID":"8b27e760-b22d-415a-93cc-866c2471ee63","Type":"ContainerStarted","Data":"e985a14b8183b03388e3baee0985df9660fce2dd7c4cf09d74c9a75004b92046"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.955438 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerStarted","Data":"c665d16b7a10e01bc81b15b75ff9e9a77ed47b241ec39af37be788d1bbbe03df"} Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.961053 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" podStartSLOduration=166.96103581 podStartE2EDuration="2m46.96103581s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.959680768 +0000 UTC m=+238.557613297" watchObservedRunningTime="2026-03-20 13:34:18.96103581 +0000 UTC m=+238.558968339" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.003116 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" event={"ID":"c5e0183e-e0f5-4b89-a2f9-27fc07783e27","Type":"ContainerStarted","Data":"ee1bd8b4abe872c704afebc0bff51de7c8acb2686a5e91fbbc3df0cf4c48714b"} Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.003299 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.016280 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.017235 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.019550 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:19.519518372 +0000 UTC m=+239.117450901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.026368 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" podStartSLOduration=167.026338269 podStartE2EDuration="2m47.026338269s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.985160394 +0000 UTC m=+238.583092923" watchObservedRunningTime="2026-03-20 13:34:19.026338269 +0000 UTC m=+238.624270798" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.034096 4755 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m66xn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.034259 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" podUID="c5e0183e-e0f5-4b89-a2f9-27fc07783e27" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 
13:34:19.047208 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.048811 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" podStartSLOduration=167.048788873 podStartE2EDuration="2m47.048788873s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:19.047604997 +0000 UTC m=+238.645537526" watchObservedRunningTime="2026-03-20 13:34:19.048788873 +0000 UTC m=+238.646721402" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.049065 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" podStartSLOduration=167.049059741 podStartE2EDuration="2m47.049059741s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:19.0142078 +0000 UTC m=+238.612140329" watchObservedRunningTime="2026-03-20 13:34:19.049059741 +0000 UTC m=+238.646992270" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.120009 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.121946 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.62190839 +0000 UTC m=+239.219840909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.168871 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" podStartSLOduration=167.168850711 podStartE2EDuration="2m47.168850711s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:19.160037643 +0000 UTC m=+238.757970192" watchObservedRunningTime="2026-03-20 13:34:19.168850711 +0000 UTC m=+238.766783240" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.223037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.224377 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:19.724354042 +0000 UTC m=+239.322286571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.288148 4755 patch_prober.go:28] interesting pod/apiserver-76f77b778f-6ql2s container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]log ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]etcd ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/max-in-flight-filter ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 13:34:19 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 13:34:19 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 13:34:19 crc kubenswrapper[4755]: [-]poststarthook/openshift.io-startinformers failed: 
reason withheld Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 13:34:19 crc kubenswrapper[4755]: livez check failed Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.288242 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" podUID="b41fdebf-1886-4b30-b583-368242316562" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.329403 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.329568 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.829542326 +0000 UTC m=+239.427474855 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.329703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.330042 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.830029951 +0000 UTC m=+239.427962480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.434519 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.434647 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.934629648 +0000 UTC m=+239.532562177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.435055 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.435559 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.935550536 +0000 UTC m=+239.533483065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.537340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.537528 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.037500702 +0000 UTC m=+239.635433231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.538223 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.539389 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.039376649 +0000 UTC m=+239.637309368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.639759 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.647099 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.147068611 +0000 UTC m=+239.745001140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.647322 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.647896 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.147885426 +0000 UTC m=+239.745817945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.668979 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:19 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:19 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:19 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.669083 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.752680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.753019 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:20.252977117 +0000 UTC m=+239.850909646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.753203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.753734 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.25372583 +0000 UTC m=+239.851658359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.854753 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.855426 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.355368417 +0000 UTC m=+239.953300956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.856124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.856623 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.356596564 +0000 UTC m=+239.954529093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.957401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.957583 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.457548839 +0000 UTC m=+240.055481368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.958001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.958441 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.458425007 +0000 UTC m=+240.056357536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.059186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.059438 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.559399053 +0000 UTC m=+240.157331582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.059597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.060039 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.560023202 +0000 UTC m=+240.157955721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.065637 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54302: no serving certificate available for the kubelet" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.087162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" event={"ID":"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9","Type":"ContainerStarted","Data":"074f11e0e9f7a406a73c13235fca4b755abec8ffe9f594e1827d35673d23f526"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.092882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2rw7x" event={"ID":"af19a889-4a85-42c6-aafa-6714754c5a86","Type":"ContainerStarted","Data":"26c43a8cf27365109f74a654174b902d4ce3f9384b9e7037104956dadbda0dcf"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.092940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2rw7x" event={"ID":"af19a889-4a85-42c6-aafa-6714754c5a86","Type":"ContainerStarted","Data":"45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.105618 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" event={"ID":"04d33b23-44ac-48b5-8981-fe9a764b1bee","Type":"ContainerStarted","Data":"34e6784586bb9dbd83ead2f26100ea97ea937ff50931de64353977525ad70705"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.117334 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" event={"ID":"14060dd8-a97a-404b-9020-9f9e519e78d9","Type":"ContainerStarted","Data":"7e379a11cc9f18b070bfba4bf9f24341cf71c9ebadf67f86b32d424ed7cbfe6f"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.124142 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" podStartSLOduration=167.124097134 podStartE2EDuration="2m47.124097134s" podCreationTimestamp="2026-03-20 13:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.122946038 +0000 UTC m=+239.720878567" watchObservedRunningTime="2026-03-20 13:34:20.124097134 +0000 UTC m=+239.722029663" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.136029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" event={"ID":"c452cf60-67a1-434f-b2da-7e81992e28a6","Type":"ContainerStarted","Data":"562703a64193302ee027aec6434993d311409931f91424a8a3bc7f45352d8b93"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.174883 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.177388 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.677364467 +0000 UTC m=+240.275296996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.181952 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" event={"ID":"0e40521a-c254-4fd5-99e8-1296dd288e2d","Type":"ContainerStarted","Data":"1527d02eab2d2272d4b2383ebbd8ffe7459e690c29c19db34298ea1367d33353"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.182006 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" event={"ID":"0e40521a-c254-4fd5-99e8-1296dd288e2d","Type":"ContainerStarted","Data":"36bf03bc21c96c081022e5457100171ea34a5f0c7e3891bd2a94a15d54fc11ec"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.195732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" event={"ID":"e8774c8e-1dd9-481c-9091-85a2fe704069","Type":"ContainerStarted","Data":"c2e934a6dd55c7072b44da340f236e922e8823aa86ab19cc0bd48fdeb7c98ad1"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.195779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" event={"ID":"e8774c8e-1dd9-481c-9091-85a2fe704069","Type":"ContainerStarted","Data":"d39ca1909e22b17c2b346fc275569453aa15b6c7c78769bd4ee043798e72356f"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.207599 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" podStartSLOduration=168.207571798 podStartE2EDuration="2m48.207571798s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.206827794 +0000 UTC m=+239.804760323" watchObservedRunningTime="2026-03-20 13:34:20.207571798 +0000 UTC m=+239.805504327" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.207765 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" podStartSLOduration=168.207759103 podStartE2EDuration="2m48.207759103s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.174288243 +0000 UTC m=+239.772220772" watchObservedRunningTime="2026-03-20 13:34:20.207759103 +0000 UTC m=+239.805691642" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.213434 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" event={"ID":"903404df-f7c6-46d5-9227-748ecc920ac3","Type":"ContainerStarted","Data":"b569574f673b8dcaa9e3744f50b218f55339a7ccb1cb77efb3cb16f9547b1966"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.213497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" event={"ID":"903404df-f7c6-46d5-9227-748ecc920ac3","Type":"ContainerStarted","Data":"a1e27ddb2360de2885346f06685fb71d108322ddf8db9a5aab39b6a0caaeb336"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.229600 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" 
event={"ID":"d8def433-c490-4469-9e43-12ba06428091","Type":"ContainerStarted","Data":"d8f4b5903ec1b8d49a7f0244972cd97783793a7d76ccefe2e5e34e7d48888409"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.229673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" event={"ID":"d8def433-c490-4469-9e43-12ba06428091","Type":"ContainerStarted","Data":"cd02889b8f102d407e8cda11aebe9cac21194831626783b6dc4045596519d058"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.252242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" event={"ID":"4ea64dfa-8103-47fc-9ad3-693b033a1ec1","Type":"ContainerStarted","Data":"165a7083062ced917c24d885b7a0d89257d3da32d002f75a9e3c209f10e72f59"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.252293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" event={"ID":"4ea64dfa-8103-47fc-9ad3-693b033a1ec1","Type":"ContainerStarted","Data":"d5827cbd34536cc1d64209344c68c388f554f3d5f6e7218403fb14d4bdf44d46"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.252305 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" event={"ID":"4ea64dfa-8103-47fc-9ad3-693b033a1ec1","Type":"ContainerStarted","Data":"06c181e0fbc16493dcf66776b27278ff10f19457ee1e23f63560cbac303f5743"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.253015 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.271461 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" podStartSLOduration=168.271441763 podStartE2EDuration="2m48.271441763s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.268007138 +0000 UTC m=+239.865939667" watchObservedRunningTime="2026-03-20 13:34:20.271441763 +0000 UTC m=+239.869374292" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.274876 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r96g9" event={"ID":"51b44b44-8a09-430a-ba3c-92e2c2f916f6","Type":"ContainerStarted","Data":"6a470248db5bcdcbd851ec9d93aeae048e560d5e4f3964642d2bb0e261783b5f"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.274932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r96g9" event={"ID":"51b44b44-8a09-430a-ba3c-92e2c2f916f6","Type":"ContainerStarted","Data":"1523344c4416ca2fa6a9d36b485c0350c987b557609e59798908e6ef70e640c3"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.277400 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.280114 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.780088727 +0000 UTC m=+240.378021456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.291184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerStarted","Data":"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.291239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerStarted","Data":"ffd17bcea5582e9144ff86b2de342c1b3c61951742cefde886baf98d6e66252d"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.291778 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.293700 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-229g6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.293752 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.308988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"e40bc158eb9411a59253f82e1e8bcbc74fd16a747555f5dfd4f844a70cd069a7"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.357177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" event={"ID":"27e50d13-5c93-4dd7-a2c8-7ba505e2f549","Type":"ContainerStarted","Data":"5a34fdf03efebf84f169fb8b7aa970bfd47809ef847318147656f3236950a798"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.385329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.387185 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.887161669 +0000 UTC m=+240.485094198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.400472 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" event={"ID":"c5e0183e-e0f5-4b89-a2f9-27fc07783e27","Type":"ContainerStarted","Data":"e8aa715492c4c86ff9bf4336ddb65914dd8f892dbd26482bc316444af61d8881"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.446792 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" podStartSLOduration=168.446759975 podStartE2EDuration="2m48.446759975s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.29959371 +0000 UTC m=+239.897526239" watchObservedRunningTime="2026-03-20 13:34:20.446759975 +0000 UTC m=+240.044692504" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.448339 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" event={"ID":"b56ded58-9184-4d39-b422-9ea9e8f6b9ea","Type":"ContainerStarted","Data":"c9390327472cf673d0b59be92f646bdef72fc5b3ee002719db11b571fd677600"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.448398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" 
event={"ID":"b56ded58-9184-4d39-b422-9ea9e8f6b9ea","Type":"ContainerStarted","Data":"73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.450328 4755 patch_prober.go:28] interesting pod/console-operator-58897d9998-6zgr4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.450383 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" podUID="8ff5ba16-93f9-4313-a857-23a1c87c1cac" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.471534 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" podStartSLOduration=168.471516539 podStartE2EDuration="2m48.471516539s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.426033233 +0000 UTC m=+240.023965762" watchObservedRunningTime="2026-03-20 13:34:20.471516539 +0000 UTC m=+240.069449068" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.490788 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 
13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.491471 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.991449396 +0000 UTC m=+240.589381925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.495216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.510819 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" podStartSLOduration=168.510790385 podStartE2EDuration="2m48.510790385s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.474927533 +0000 UTC m=+240.072860062" watchObservedRunningTime="2026-03-20 13:34:20.510790385 +0000 UTC m=+240.108722914" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.512964 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.581450 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-r96g9" podStartSLOduration=8.581424957 podStartE2EDuration="8.581424957s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.513179418 +0000 UTC m=+240.111111947" watchObservedRunningTime="2026-03-20 13:34:20.581424957 +0000 UTC m=+240.179357486" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.593452 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.595712 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.095679871 +0000 UTC m=+240.693612390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.627411 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podStartSLOduration=168.627377427 podStartE2EDuration="2m48.627377427s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.622130438 +0000 UTC m=+240.220062967" watchObservedRunningTime="2026-03-20 13:34:20.627377427 +0000 UTC m=+240.225309956" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.636142 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" podStartSLOduration=168.636102184 podStartE2EDuration="2m48.636102184s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.583259703 +0000 UTC m=+240.181192222" watchObservedRunningTime="2026-03-20 13:34:20.636102184 +0000 UTC m=+240.234034713" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.652022 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:20 crc 
kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:20 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:20 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.652133 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.696690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.698606 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.198565946 +0000 UTC m=+240.796498475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.749547 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" podStartSLOduration=167.749520709 podStartE2EDuration="2m47.749520709s" podCreationTimestamp="2026-03-20 13:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.701759263 +0000 UTC m=+240.299691792" watchObservedRunningTime="2026-03-20 13:34:20.749520709 +0000 UTC m=+240.347453238" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.801329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.801933 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.301917735 +0000 UTC m=+240.899850264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.836809 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" podStartSLOduration=168.836779247 podStartE2EDuration="2m48.836779247s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.828569887 +0000 UTC m=+240.426502416" watchObservedRunningTime="2026-03-20 13:34:20.836779247 +0000 UTC m=+240.434711776" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.903690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.904064 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.404049906 +0000 UTC m=+241.001982435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.006478 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.007469 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.507440917 +0000 UTC m=+241.105373456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.109416 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.110319 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.610303031 +0000 UTC m=+241.208235560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.158920 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.160372 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.164341 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.214708 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.714684661 +0000 UTC m=+241.312617190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.214575 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.215001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.215493 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.715486385 +0000 UTC m=+241.313418914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.263326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.330764 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.338771 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.83873075 +0000 UTC m=+241.436663279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339511 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.340165 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.840156623 +0000 UTC m=+241.438089152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.378003 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.397899 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.400934 4755 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m66xn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.401419 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" podUID="c5e0183e-e0f5-4b89-a2f9-27fc07783e27" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.402257 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.420351 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.441969 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.941939384 +0000 UTC m=+241.539871913 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.441977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " 
pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.443273 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.943255485 +0000 UTC m=+241.541188014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.443322 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.443397 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"certified-operators-shzbw\" (UID: 
\"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.489150 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"73232c6917a3217823fbedbea721a4e33d2ce945f5dbad74eb27d3fb0c436a58"} Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.494016 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2rw7x" event={"ID":"af19a889-4a85-42c6-aafa-6714754c5a86","Type":"ContainerStarted","Data":"ae5b7ba16724b9e103a5e8ab55de8a2abf69a222f5e75301c5843aab39aee6df"} Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.495030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.502477 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.514602 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-229g6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.514986 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544301 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544544 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2rw7x" podStartSLOduration=9.5445247 podStartE2EDuration="9.5445247s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:21.541810097 +0000 UTC m=+241.139742626" watchObservedRunningTime="2026-03-20 13:34:21.5445247 +0000 UTC m=+241.142457219" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544699 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qg9\" (UniqueName: 
\"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544806 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.545275 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.045253622 +0000 UTC m=+241.643186151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.567843 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.568982 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.588884 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.602337 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.637990 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:21 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:21 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:21 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.638052 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.654935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.655986 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656103 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: 
\"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656742 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656909 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.663571 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.163545776 +0000 UTC m=+241.761478535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.665953 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.666229 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.691829 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.699953 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766249 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766539 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766622 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.767100 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.26708308 +0000 UTC m=+241.865015609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.767495 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.767854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.796459 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.796861 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.797572 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.803919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.830046 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.885931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.886316 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.386299723 +0000 UTC m=+241.984232252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.936084 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.988924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.989289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.989332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.989414 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.989540 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.489518498 +0000 UTC m=+242.087451027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.090534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.090591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:22 crc 
kubenswrapper[4755]: I0320 13:34:22.090621 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.090706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.091619 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.092171 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.092463 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.592451233 +0000 UTC m=+242.190383762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.097838 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.144097 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.149611 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.191518 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.192233 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:22.692210753 +0000 UTC m=+242.290143292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.193517 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.300346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.300856 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.800843223 +0000 UTC m=+242.398775742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.408477 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.409606 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.909584076 +0000 UTC m=+242.507516605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.422148 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.422751 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" containerID="cri-o://621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" gracePeriod=30 Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.444228 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.444637 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" containerID="cri-o://f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" gracePeriod=30 Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.512930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: 
\"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.513376 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.013361127 +0000 UTC m=+242.611293656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.581332 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.603095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerStarted","Data":"0ab76dafe853da1151a253ddbccefd2f71d9bf47c5abfc10da67278b7f81253e"} Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.614411 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.614916 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.114894621 +0000 UTC m=+242.712827140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.633615 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:22 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:22 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:22 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.633714 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.694259 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.718803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.719207 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.219191218 +0000 UTC m=+242.817123747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.821310 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.821859 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.321838506 +0000 UTC m=+242.919771045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.923374 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.923746 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.42372857 +0000 UTC m=+243.021661099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.995487 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.996511 4755 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-g4ftg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.996575 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.010927 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.024141 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.024711 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.524691755 +0000 UTC m=+243.122624284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.125819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.126350 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.626329893 +0000 UTC m=+243.224262422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.141228 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.157051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.157413 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.157429 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.157561 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.158539 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.163355 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.175892 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227304 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227451 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227568 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227608 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227800 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.229153 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.229198 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config" (OuterVolumeSpecName: "config") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.229501 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.729304039 +0000 UTC m=+243.327236748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.229522 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.244121 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7" (OuterVolumeSpecName: "kube-api-access-tglm7") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "kube-api-access-tglm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.254179 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.331747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332129 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332244 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332409 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332414 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332424 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332663 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332683 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332697 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.333098 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.833079231 +0000 UTC m=+243.431011970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.334196 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.336004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.351802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:34:23 crc kubenswrapper[4755]: W0320 13:34:23.394805 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184aa529_45c4_42c9_8eee_04bd18fba718.slice/crio-24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509 WatchSource:0}: Error finding container 24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509: Status 404 returned error can't find the container with id 24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.422218 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.438896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439439 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439570 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdrkd\" (UniqueName: 
\"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.441840 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.941802534 +0000 UTC m=+243.539735063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.442497 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config" (OuterVolumeSpecName: "config") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.442647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.446465 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd" (OuterVolumeSpecName: "kube-api-access-zdrkd") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "kube-api-access-zdrkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.446876 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.488200 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.553968 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554205 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554222 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554278 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554293 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.554771 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.054728044 +0000 UTC m=+243.652660573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.558334 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.559071 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.560159 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.574082 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.576802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.585458 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.636299 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:23 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:23 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:23 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.636370 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642814 4755 generic.go:334] "Generic (PLEG): container finished" podID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642899 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerDied","Data":"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" 
event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerDied","Data":"ea03e21c825372e4f508e4183f07bab9440aa36d8af7963578ed0bad5bcf3f8f"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642948 4755 scope.go:117] "RemoveContainer" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.643115 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655591 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.655767 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.155737782 +0000 UTC m=+243.753670311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655842 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655968 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblfx\" (UniqueName: 
\"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.656370 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.156353501 +0000 UTC m=+243.754286040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.661477 4755 generic.go:334] "Generic (PLEG): container finished" podID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.661686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.661722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerStarted","Data":"28734b0d2914118b3d9d2819be5a8fd3a2768be1a04f071ed6cc45a5baf248f6"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 
13:34:23.692600 4755 generic.go:334] "Generic (PLEG): container finished" podID="2db67acd-25db-47a7-80ea-da4065a60e23" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.692805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699069 4755 generic.go:334] "Generic (PLEG): container finished" podID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerDied","Data":"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699207 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerDied","Data":"3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699305 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.710514 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.715554 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.719678 4755 scope.go:117] "RemoveContainer" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.722608 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4\": container with ID starting with 621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4 not found: ID does not exist" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.723023 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4"} err="failed to get container status \"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4\": rpc error: code = NotFound desc = could not find container \"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4\": container with ID starting with 621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4 not found: ID does not exist" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.723061 4755 scope.go:117] "RemoveContainer" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.750321 4755 generic.go:334] "Generic 
(PLEG): container finished" podID="184aa529-45c4-42c9-8eee-04bd18fba718" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.750830 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.750915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerStarted","Data":"24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.757794 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.758029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.758090 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 
20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.758139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.759441 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.259422031 +0000 UTC m=+243.857354560 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.759847 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.760558 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" 
Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.762512 4755 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.785008 4755 generic.go:334] "Generic (PLEG): container finished" podID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.785136 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.785176 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerStarted","Data":"b82219efa86cff3e92cd1609c0f3a02dacbb886afd0558266c139f378ee30512"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.790718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.799520 4755 scope.go:117] "RemoveContainer" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.799890 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf\": container with ID 
starting with f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf not found: ID does not exist" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.799926 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf"} err="failed to get container status \"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf\": rpc error: code = NotFound desc = could not find container \"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf\": container with ID starting with f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf not found: ID does not exist" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.800900 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"62fcbeafe490ddef0d169e1b68e282bff334419b04ae05d301d4286b25cdfb58"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.804931 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.807577 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.861696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: 
E0320 13:34:23.864883 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.364866203 +0000 UTC m=+243.962798742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.904114 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:34:23 crc kubenswrapper[4755]: W0320 13:34:23.935684 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d2017d2_f4ee_4056_b350_cc313f3faeaf.slice/crio-3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27 WatchSource:0}: Error finding container 3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27: Status 404 returned error can't find the container with id 3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.954165 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.963049 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.963508 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.463486027 +0000 UTC m=+244.061418556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.979035 4755 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pz64x container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: i/o timeout" start-of-body= Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.979217 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: i/o timeout" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:23.997844 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:23.999458 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.009851 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.010135 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.019135 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.026794 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.026866 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.026803 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" 
start-of-body= Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.027242 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.064903 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.065053 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.065133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.065617 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:24.565603369 +0000 UTC m=+244.163535898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.167173 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.167383 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.667349019 +0000 UTC m=+244.265281548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.168084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.170436 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.170801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.170858 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:24.670847855 +0000 UTC m=+244.268780384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.171280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.199341 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.273895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.274259 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.774241745 +0000 UTC m=+244.372174274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.350801 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.353507 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361769 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361830 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361865 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361933 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.362234 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.362746 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.363744 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.366374 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.367475 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.370069 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.370678 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.371010 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.371318 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.372347 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.372732 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.372861 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.375272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.375784 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.875764209 +0000 UTC m=+244.473696738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.382678 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.398724 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.468557 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.476006 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.476247 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:24.976201878 +0000 UTC m=+244.574134407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.476441 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.476978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477121 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477194 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477311 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477493 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.478077 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.978061975 +0000 UTC m=+244.575994504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: W0320 13:34:24.478098 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce4d5763_1786_4b87_8497_0c65da46f446.slice/crio-bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e WatchSource:0}: Error finding container bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e: Status 404 returned error can't find the container with id bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.499719 4755 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T13:34:23.762541456Z","Handler":null,"Name":""} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.505067 4755 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.505133 4755 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.544924 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 
13:34:24.546076 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.552063 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.555168 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.578870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579119 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579189 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579255 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579312 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579344 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc 
kubenswrapper[4755]: I0320 13:34:24.579470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579515 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.582038 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.583537 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.583995 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " 
pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.584903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.585174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.589531 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.589794 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.589939 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.611278 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.621077 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.629100 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:24 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:24 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:24 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.629168 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.682481 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.682968 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.682999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.683102 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.685976 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.686842 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.686918 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.735988 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.745661 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.784537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.784598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.784660 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.785066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.785174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " 
pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.815603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.852018 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce4d5763-1786-4b87-8497-0c65da46f446" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" exitCode=0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.852372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.854961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerStarted","Data":"bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.878104 4755 generic.go:334] "Generic (PLEG): container finished" podID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" exitCode=0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.878787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.878853 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerStarted","Data":"3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.886441 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.928067 4755 generic.go:334] "Generic (PLEG): container finished" podID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerID="c665d16b7a10e01bc81b15b75ff9e9a77ed47b241ec39af37be788d1bbbe03df" exitCode=0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.928431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerDied","Data":"c665d16b7a10e01bc81b15b75ff9e9a77ed47b241ec39af37be788d1bbbe03df"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.951545 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.979530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.979855 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.980026 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.990406 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.008819 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.010224 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.018830 4755 patch_prober.go:28] interesting pod/console-f9d7485db-rb5zn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.020214 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rb5zn" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.020929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"882b5ec0574afffeb68d87ce1070f73838a10020e187a200742440e96eb7d45d"} Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.021086 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"06d905723e36898a8fe0fb2068c3547b72ddb1a94c670b4c5b7f5f3d14d9b16e"} Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.043708 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:34:25 crc kubenswrapper[4755]: 
I0320 13:34:25.069217 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" podStartSLOduration=13.069187405 podStartE2EDuration="13.069187405s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:25.05623673 +0000 UTC m=+244.654169249" watchObservedRunningTime="2026-03-20 13:34:25.069187405 +0000 UTC m=+244.667119934" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.093636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.094071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.094292 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.128720 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:25 crc 
kubenswrapper[4755]: I0320 13:34:25.196580 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.196877 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.196929 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.197562 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.198466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: W0320 13:34:25.210081 4755 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod798ec963_27eb_429b_8cbd_310fbf41feb2.slice/crio-29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28 WatchSource:0}: Error finding container 29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28: Status 404 returned error can't find the container with id 29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28 Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.229872 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.233636 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54304: no serving certificate available for the kubelet" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.275900 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" path="/var/lib/kubelet/pods/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2/volumes" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.277516 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.278401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" path="/var/lib/kubelet/pods/d0eef306-2a08-40d1-82cf-ad6d81923c67/volumes" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.329620 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.388775 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.476452 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.482415 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.482532 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.484697 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.484747 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.507038 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.512522 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.512609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.521215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:25 crc kubenswrapper[4755]: W0320 13:34:25.608634 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408c6869_42d8_4cbc_a261_57fb45f0d666.slice/crio-500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6 WatchSource:0}: Error finding container 500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6: Status 404 returned error can't find the container with id 500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6 Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.619677 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.620251 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.620546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.631563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.637278 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:25 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:25 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:25 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.637347 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.641030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.823585 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.983209 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.076872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerStarted","Data":"a61328313f937859da2cf48f31d75e9a9fe762c184fc7b19d3a3054d5f888855"} Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.112419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerStarted","Data":"500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6"} Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.141338 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerStarted","Data":"67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327"} Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.141410 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerStarted","Data":"29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28"} Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.142702 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.170019 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.179009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerStarted","Data":"ea85ece18daec304b9cecefa9ca55b3c7ddbfc128e021ebc4bfd2b1a692b4346"} Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.201923 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podStartSLOduration=4.201905245 podStartE2EDuration="4.201905245s" podCreationTimestamp="2026-03-20 13:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:26.200128461 +0000 UTC m=+245.798060990" watchObservedRunningTime="2026-03-20 13:34:26.201905245 +0000 UTC m=+245.799837774" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.214974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerStarted","Data":"1f5e663be2a59aec380b38807a322f3980150870b6f5114ac3d543094b13a3ea"} Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.278420 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54306: no serving certificate available for the kubelet" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.340633 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:34:26 crc kubenswrapper[4755]: W0320 13:34:26.378974 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod90e64094_373f_4a6b_ad6d_a68096ece17d.slice/crio-484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768 WatchSource:0}: Error finding container 
484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768: Status 404 returned error can't find the container with id 484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768 Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.630095 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.630555 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:26 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:26 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:26 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.630589 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.733630 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.745831 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.745982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm6mz\" (UniqueName: 
\"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.746056 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.748028 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "4b90540c-9ef1-478a-a7a1-48817d0c63d0" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.760722 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4b90540c-9ef1-478a-a7a1-48817d0c63d0" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.762552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz" (OuterVolumeSpecName: "kube-api-access-mm6mz") pod "4b90540c-9ef1-478a-a7a1-48817d0c63d0" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0"). InnerVolumeSpecName "kube-api-access-mm6mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.847994 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.848028 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.848050 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.152495 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.156154 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.187156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.268081 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.268122 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerStarted","Data":"484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.268162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerStarted","Data":"f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.270976 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerStarted","Data":"a38ecde488458ddf6ff6ad80f05c544025f8f6c6547c5ca4206fb91093240597"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.275013 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerStarted","Data":"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.275846 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.279959 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.280056 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerDied","Data":"d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.280130 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.282188 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podStartSLOduration=5.282168217 podStartE2EDuration="5.282168217s" podCreationTimestamp="2026-03-20 13:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:27.280379493 +0000 UTC m=+246.878312022" watchObservedRunningTime="2026-03-20 13:34:27.282168217 +0000 UTC m=+246.880100746" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.294347 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"3af5416fdb7aa4f016f347ab29ea0d465dc03414f176f3ce6611a45e7044555c"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.294500 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerID="3af5416fdb7aa4f016f347ab29ea0d465dc03414f176f3ce6611a45e7044555c" exitCode=0 Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.294717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" 
event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerStarted","Data":"e8b20b8055283611079980efbe691798926e8e8d967c03c6c2d40a174aa03339"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.304747 4755 generic.go:334] "Generic (PLEG): container finished" podID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" exitCode=0 Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.305069 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad"} Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.305718 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.307327 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.314581 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.314599 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" podStartSLOduration=175.314566075 podStartE2EDuration="2m55.314566075s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:27.308926303 +0000 UTC m=+246.906858852" watchObservedRunningTime="2026-03-20 13:34:27.314566075 +0000 UTC m=+246.912498604" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.336291 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.336263385 podStartE2EDuration="4.336263385s" podCreationTimestamp="2026-03-20 13:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:27.330427257 +0000 UTC m=+246.928359786" watchObservedRunningTime="2026-03-20 13:34:27.336263385 +0000 UTC m=+246.934195914" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.632028 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:27 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:27 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:27 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.632486 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.962364 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpm42"] Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.329938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerStarted","Data":"632a9a4fada7a8c98e195a0e2170860a1efb7d0f8bbf52f718f07ac386a75e8c"} Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.340677 4755 generic.go:334] "Generic (PLEG): container finished" podID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerID="a38ecde488458ddf6ff6ad80f05c544025f8f6c6547c5ca4206fb91093240597" exitCode=0 Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.340871 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerDied","Data":"a38ecde488458ddf6ff6ad80f05c544025f8f6c6547c5ca4206fb91093240597"} Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.351241 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.351213408 podStartE2EDuration="3.351213408s" podCreationTimestamp="2026-03-20 13:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:28.347192405 +0000 UTC m=+247.945124954" watchObservedRunningTime="2026-03-20 13:34:28.351213408 +0000 UTC m=+247.949145947" Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.370481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpm42" 
event={"ID":"37d1e037-c169-4932-9928-f3d23ff47c07","Type":"ContainerStarted","Data":"fa60a902afe6f653701116f8903fb4077b7003ddd2d261f9fa3d07443b01a9b4"} Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.629626 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.635087 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.382562 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerDied","Data":"632a9a4fada7a8c98e195a0e2170860a1efb7d0f8bbf52f718f07ac386a75e8c"} Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.392040 4755 generic.go:334] "Generic (PLEG): container finished" podID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerID="632a9a4fada7a8c98e195a0e2170860a1efb7d0f8bbf52f718f07ac386a75e8c" exitCode=0 Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.403122 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpm42" event={"ID":"37d1e037-c169-4932-9928-f3d23ff47c07","Type":"ContainerStarted","Data":"20415f7e779e5387c6e3dd577beaa0b2a4b0b1363e9385777ac24a07168446d6"} Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.829058 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.936924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"b86b19d6-a389-4b02-b514-f828f685b7fc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.936992 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"b86b19d6-a389-4b02-b514-f828f685b7fc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.937124 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b86b19d6-a389-4b02-b514-f828f685b7fc" (UID: "b86b19d6-a389-4b02-b514-f828f685b7fc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.937436 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.980355 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b86b19d6-a389-4b02-b514-f828f685b7fc" (UID: "b86b19d6-a389-4b02-b514-f828f685b7fc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.039140 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.427196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerDied","Data":"a61328313f937859da2cf48f31d75e9a9fe762c184fc7b19d3a3054d5f888855"} Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.427563 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.427579 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61328313f937859da2cf48f31d75e9a9fe762c184fc7b19d3a3054d5f888855" Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.432625 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpm42" event={"ID":"37d1e037-c169-4932-9928-f3d23ff47c07","Type":"ContainerStarted","Data":"c573931a211d01b148c78ade59f393c00ec55ea160bfcd9a82c8214167e55ae2"} Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.455699 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kpm42" podStartSLOduration=178.455670554 podStartE2EDuration="2m58.455670554s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:30.44635397 +0000 UTC m=+250.044286499" watchObservedRunningTime="2026-03-20 13:34:30.455670554 +0000 UTC m=+250.053603083" Mar 20 13:34:31 crc 
kubenswrapper[4755]: I0320 13:34:31.098426 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:34 crc kubenswrapper[4755]: I0320 13:34:34.032196 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:35 crc kubenswrapper[4755]: I0320 13:34:35.026178 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:35 crc kubenswrapper[4755]: I0320 13:34:35.030079 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:35 crc kubenswrapper[4755]: I0320 13:34:35.512864 4755 ???:1] "http: TLS handshake error from 192.168.126.11:49074: no serving certificate available for the kubelet" Mar 20 13:34:36 crc kubenswrapper[4755]: I0320 13:34:36.751419 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:34:36 crc kubenswrapper[4755]: I0320 13:34:36.751560 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.093238 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.165287 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod \"90e64094-373f-4a6b-ad6d-a68096ece17d\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.165411 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90e64094-373f-4a6b-ad6d-a68096ece17d" (UID: "90e64094-373f-4a6b-ad6d-a68096ece17d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.165568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"90e64094-373f-4a6b-ad6d-a68096ece17d\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.168002 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.181224 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90e64094-373f-4a6b-ad6d-a68096ece17d" (UID: "90e64094-373f-4a6b-ad6d-a68096ece17d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.271162 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.571180 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerDied","Data":"484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768"} Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.571255 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.571325 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.042390 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.042665 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" containerID="cri-o://f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f" gracePeriod=30 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.063326 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.063642 4755 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" containerID="cri-o://67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327" gracePeriod=30 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.583336 4755 generic.go:334] "Generic (PLEG): container finished" podID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerID="67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327" exitCode=0 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.583444 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerDied","Data":"67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327"} Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.585789 4755 generic.go:334] "Generic (PLEG): container finished" podID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerID="f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f" exitCode=0 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.585840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerDied","Data":"f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f"} Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.686592 4755 patch_prober.go:28] interesting pod/route-controller-manager-589b99697b-vh78j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.687277 4755 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.747376 4755 patch_prober.go:28] interesting pod/controller-manager-5db74bc9fd-wp4mj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.747933 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.997433 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.685944 4755 patch_prober.go:28] interesting pod/route-controller-manager-589b99697b-vh78j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.687129 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.746619 4755 patch_prober.go:28] interesting pod/controller-manager-5db74bc9fd-wp4mj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.746807 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.888575 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.123397 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage136669660/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.123639 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgvrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lkkl5_openshift-marketplace(e9ec78bf-3afe-49d9-983a-99645840cecb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage136669660/2\": happened during read: context canceled" logger="UnhandledError" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.124983 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage136669660/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.599744 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.600353 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600371 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.600383 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerName="collect-profiles" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600391 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerName="collect-profiles" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.600406 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600411 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600521 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerName="collect-profiles" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600533 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600543 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600997 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.605278 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.605736 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.610231 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.638415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.638482 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.739678 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") 
" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.739767 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.740628 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.763272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.947487 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.047419 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.090227 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.090834 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:34:57 crc kubenswrapper[4755]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 13:34:57 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xqgfr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
auto-csr-approver-29566892-xh9lg_openshift-infra(28deea0d-d80e-422b-a0c2-40670570aa68): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 13:34:57 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.092390 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.102615 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.109369 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.110406 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.110512 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:34:57 crc kubenswrapper[4755]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 13:34:57 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7bc8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566894-tzlc5_openshift-infra(d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 13:34:57 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.111776 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141061 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.141374 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141389 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.141407 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141416 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141524 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141540 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146295 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" 
(UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146355 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146381 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146462 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146511 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146530 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.151284 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.151435 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.151839 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.152867 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca" (OuterVolumeSpecName: "client-ca") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.153015 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config" (OuterVolumeSpecName: "config") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.155011 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.155116 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config" (OuterVolumeSpecName: "config") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.166583 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.166739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq" (OuterVolumeSpecName: "kube-api-access-jmhjq") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "kube-api-access-jmhjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.167191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.168674 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m" (OuterVolumeSpecName: "kube-api-access-hj85m") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "kube-api-access-hj85m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248358 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248573 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248791 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248806 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248821 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248833 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248891 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248902 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248911 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248924 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 
crc kubenswrapper[4755]: I0320 13:34:57.248938 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.350249 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.351095 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.351223 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.351269 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" 
Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.352521 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.352818 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.355060 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.369276 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.507877 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.681726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerDied","Data":"1f5e663be2a59aec380b38807a322f3980150870b6f5114ac3d543094b13a3ea"} Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.682102 4755 scope.go:117] "RemoveContainer" containerID="f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.681786 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.684145 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.684289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerDied","Data":"29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28"} Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.686042 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.686373 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.709172 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.713090 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.752740 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.754055 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.239877 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" path="/var/lib/kubelet/pods/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec/volumes" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.240440 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" path="/var/lib/kubelet/pods/798ec963-27eb-429b-8cbd-310fbf41feb2/volumes" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.386527 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.388541 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.391959 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.392560 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.393030 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.393749 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.393797 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.395533 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.402494 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.404004 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.482944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " 
pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483084 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483132 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483167 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584194 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584283 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584315 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584390 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.586149 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.586532 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.586544 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.590339 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.601113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc 
kubenswrapper[4755]: I0320 13:34:59.709152 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:01 crc kubenswrapper[4755]: E0320 13:35:01.853493 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:35:01 crc kubenswrapper[4755]: E0320 13:35:01.854280 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsjvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},T
erminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-929x7_openshift-marketplace(2d2017d2-f4ee-4056-b350-cc313f3faeaf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:01 crc kubenswrapper[4755]: E0320 13:35:01.856016 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-929x7" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.008620 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.098336 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.388757 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.389988 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.410334 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.426885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.426984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.427171 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528087 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528146 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528224 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528363 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528480 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.550827 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.724629 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.636594 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-929x7" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.712566 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.712793 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2rrc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lkvql_openshift-marketplace(887fa242-bd5e-40f5-8f6e-a81c6e976322): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.714056 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" Mar 20 13:35:06 crc 
kubenswrapper[4755]: I0320 13:35:06.751687 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:35:06 crc kubenswrapper[4755]: I0320 13:35:06.751851 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.238079 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.303135 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.303526 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hchb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-shzbw_openshift-marketplace(2db67acd-25db-47a7-80ea-da4065a60e23): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.306465 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-shzbw" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" Mar 20 13:35:08 crc 
kubenswrapper[4755]: E0320 13:35:08.687482 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-shzbw" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" Mar 20 13:35:08 crc kubenswrapper[4755]: I0320 13:35:08.715372 4755 scope.go:117] "RemoveContainer" containerID="67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.788584 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.788826 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pfxf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vm24m_openshift-marketplace(184aa529-45c4-42c9-8eee-04bd18fba718): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.790267 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vm24m" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" Mar 20 13:35:08 crc 
kubenswrapper[4755]: E0320 13:35:08.809885 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.810285 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nblfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-nlslg_openshift-marketplace(ce4d5763-1786-4b87-8497-0c65da46f446): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.812869 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nlslg" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.829345 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.829558 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8rpkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-d8rq7_openshift-marketplace(a751ac46-3f89-4d5a-8a23-0bbb3584dfa0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.831227 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-d8rq7" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" Mar 20 13:35:08 crc 
kubenswrapper[4755]: E0320 13:35:08.900964 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.901174 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w5qg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-cgznb_openshift-marketplace(e8e34571-6648-4e5e-b3e9-05f87454e19a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.902947 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cgznb" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.161195 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.165739 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.170396 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9201c8_bf96_460a_88c0_d37ed74be3b8.slice/crio-48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593 WatchSource:0}: Error finding container 48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593: Status 404 returned error can't find the container with id 48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593 Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.171321 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb9af4f4b_5318_4e19_948a_c976effb4bde.slice/crio-7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22 WatchSource:0}: Error finding container 7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22: Status 404 returned error can't find the container with id 
7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22 Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.239089 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc0668fdb_be01_431d_9cbb_dabae6eb44e1.slice/crio-ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e WatchSource:0}: Error finding container ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e: Status 404 returned error can't find the container with id ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.239514 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1634f58c_17b2_4fbe_b668_c0b386e97ee8.slice/crio-7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53 WatchSource:0}: Error finding container 7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53: Status 404 returned error can't find the container with id 7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53 Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.243956 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.244004 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.763592 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerStarted","Data":"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.764166 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.764183 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerStarted","Data":"7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.763771 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" containerID="cri-o://ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" gracePeriod=30 Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.771602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerStarted","Data":"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.771682 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerStarted","Data":"48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.771826 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" containerID="cri-o://efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" gracePeriod=30 Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.772222 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.775327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerStarted","Data":"fffe9bdab6834124d2b86719fd421bb0588e3380d4e9338e0e67109cffba702d"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.775375 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerStarted","Data":"ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.781292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerStarted","Data":"51c67d036c7dc57d3a165d1f76b532bbfeb40e22c709da46b4d599bbc59169ee"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.781346 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerStarted","Data":"7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22"} Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.783924 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nlslg" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.784571 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vm24m" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.784711 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cgznb" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.785212 4755 patch_prober.go:28] interesting pod/controller-manager-d494d75f7-ckcts container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": EOF" start-of-body= Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.785293 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": EOF" Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.788213 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-d8rq7" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.794550 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" podStartSLOduration=27.794523888 podStartE2EDuration="27.794523888s" podCreationTimestamp="2026-03-20 13:34:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.789426684 +0000 UTC m=+289.387359223" watchObservedRunningTime="2026-03-20 13:35:09.794523888 +0000 UTC m=+289.392456417" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.862234 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.862194761 podStartE2EDuration="7.862194761s" podCreationTimestamp="2026-03-20 13:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.86217891 +0000 UTC m=+289.460111449" watchObservedRunningTime="2026-03-20 13:35:09.862194761 +0000 UTC m=+289.460127290" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.897364 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=13.897332061 podStartE2EDuration="13.897332061s" podCreationTimestamp="2026-03-20 13:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.892928517 +0000 UTC m=+289.490861046" watchObservedRunningTime="2026-03-20 13:35:09.897332061 +0000 UTC m=+289.495264590" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.920755 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" podStartSLOduration=27.920646471 podStartE2EDuration="27.920646471s" podCreationTimestamp="2026-03-20 13:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.914579926 +0000 UTC m=+289.512512455" watchObservedRunningTime="2026-03-20 13:35:09.920646471 +0000 
UTC m=+289.518579010" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.167258 4755 patch_prober.go:28] interesting pod/route-controller-manager-7788c68c6d-l8hxg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:50880->10.217.0.60:8443: read: connection reset by peer" start-of-body= Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.167331 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:50880->10.217.0.60:8443: read: connection reset by peer" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.295436 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.329449 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:10 crc kubenswrapper[4755]: E0320 13:35:10.329844 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.329862 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.330002 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.330555 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.340952 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370388 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370487 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370513 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370583 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: 
\"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370820 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370907 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.372078 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.372086 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca" (OuterVolumeSpecName: "client-ca") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.372596 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config" (OuterVolumeSpecName: "config") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.378090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.378716 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6" (OuterVolumeSpecName: "kube-api-access-5h5h6") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "kube-api-access-5h5h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.446772 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7788c68c6d-l8hxg_ec9201c8-bf96-460a-88c0-d37ed74be3b8/route-controller-manager/0.log" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.446870 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472142 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472216 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472309 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod 
\"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472460 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472764 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472834 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"controller-manager-66bd658874-r7m4x\" 
(UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472898 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472989 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473004 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473018 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473032 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473044 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.474274 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config" (OuterVolumeSpecName: "config") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.474347 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.475613 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.477056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.477850 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx" (OuterVolumeSpecName: "kube-api-access-lrdlx") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "kube-api-access-lrdlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.479005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.481474 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.482515 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.497244 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574314 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574355 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574366 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574376 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.653916 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.719359 4755 csr.go:261] certificate signing request csr-8z59g is approved, waiting to be issued Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.739838 4755 csr.go:257] certificate signing request csr-8z59g is issued Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.789397 4755 generic.go:334] "Generic (PLEG): container finished" podID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" exitCode=0 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.789530 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.790163 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerDied","Data":"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.790253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerDied","Data":"7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.790315 4755 scope.go:117] "RemoveContainer" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794589 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7788c68c6d-l8hxg_ec9201c8-bf96-460a-88c0-d37ed74be3b8/route-controller-manager/0.log" Mar 20 13:35:10 crc 
kubenswrapper[4755]: I0320 13:35:10.794639 4755 generic.go:334] "Generic (PLEG): container finished" podID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" exitCode=255 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794765 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerDied","Data":"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerDied","Data":"48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794902 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.806309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerStarted","Data":"6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.823238 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" podStartSLOduration=18.20812138 podStartE2EDuration="1m10.82321528s" podCreationTimestamp="2026-03-20 13:34:00 +0000 UTC" firstStartedPulling="2026-03-20 13:34:17.36785724 +0000 UTC m=+236.965789769" lastFinishedPulling="2026-03-20 13:35:09.98295115 +0000 UTC m=+289.580883669" observedRunningTime="2026-03-20 13:35:10.821376243 +0000 UTC m=+290.419308772" watchObservedRunningTime="2026-03-20 13:35:10.82321528 +0000 UTC m=+290.421147809" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.825869 4755 generic.go:334] "Generic (PLEG): container finished" podID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerID="51c67d036c7dc57d3a165d1f76b532bbfeb40e22c709da46b4d599bbc59169ee" exitCode=0 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.825926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerDied","Data":"51c67d036c7dc57d3a165d1f76b532bbfeb40e22c709da46b4d599bbc59169ee"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.863578 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.869825 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.871828 4755 scope.go:117] "RemoveContainer" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" Mar 20 13:35:10 crc kubenswrapper[4755]: E0320 13:35:10.872383 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9\": container with ID starting with ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9 not found: ID does not exist" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.872435 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9"} err="failed to get container status \"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9\": rpc error: code = NotFound desc = could not find container \"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9\": container with ID starting with ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9 not found: ID does not exist" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.872473 4755 scope.go:117] "RemoveContainer" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.873090 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.875756 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.878180 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:10 crc kubenswrapper[4755]: W0320 13:35:10.885771 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48c0ce9_15b4_48fc_b6f8_bbd69d45e6bc.slice/crio-9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3 WatchSource:0}: Error finding container 9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3: Status 404 returned error can't find the container with id 9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.891442 4755 scope.go:117] "RemoveContainer" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" Mar 20 13:35:10 crc kubenswrapper[4755]: E0320 13:35:10.892394 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8\": container with ID starting with efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8 not found: ID does not exist" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.892427 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8"} err="failed to get container status \"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8\": rpc error: code = NotFound desc = could not find container \"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8\": container with ID starting with efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8 not found: ID does not exist" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.234907 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" path="/var/lib/kubelet/pods/1634f58c-17b2-4fbe-b668-c0b386e97ee8/volumes" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.235727 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" path="/var/lib/kubelet/pods/ec9201c8-bf96-460a-88c0-d37ed74be3b8/volumes" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.745362 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-01 13:07:04.189884043 +0000 UTC Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.745426 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6887h31m52.444460957s for next certificate rotation Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.836276 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" containerID="6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9" exitCode=0 Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.836356 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerDied","Data":"6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9"} Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.840038 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerStarted","Data":"d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45"} Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.840077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" 
event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerStarted","Data":"9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3"} Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.841198 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.847403 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.871569 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" podStartSLOduration=9.871543989 podStartE2EDuration="9.871543989s" podCreationTimestamp="2026-03-20 13:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:11.868234178 +0000 UTC m=+291.466166707" watchObservedRunningTime="2026-03-20 13:35:11.871543989 +0000 UTC m=+291.469476518" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.092690 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.096473 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"b9af4f4b-5318-4e19-948a-c976effb4bde\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.096613 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b9af4f4b-5318-4e19-948a-c976effb4bde" (UID: "b9af4f4b-5318-4e19-948a-c976effb4bde"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.097295 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"b9af4f4b-5318-4e19-948a-c976effb4bde\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.098591 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.108445 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b9af4f4b-5318-4e19-948a-c976effb4bde" (UID: "b9af4f4b-5318-4e19-948a-c976effb4bde"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.200504 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393064 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:12 crc kubenswrapper[4755]: E0320 13:35:12.393317 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393331 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" Mar 20 13:35:12 crc kubenswrapper[4755]: E0320 13:35:12.393351 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerName="pruner" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393358 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerName="pruner" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393463 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerName="pruner" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393478 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393908 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397005 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397502 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397522 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397765 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397904 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397933 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404717 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod 
\"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.409443 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506779 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: 
\"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506864 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.508018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.508032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.511069 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.523511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.717737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.746268 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-03 21:31:18.776131332 +0000 UTC Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.746315 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6943h56m6.029820612s for next certificate rotation Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.875290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerDied","Data":"7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22"} Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.875792 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.875499 4755 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.982553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:13 crc kubenswrapper[4755]: W0320 13:35:13.001531 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dedcffe_6047_46bb_9970_eed6e7dfcd2a.slice/crio-f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa WatchSource:0}: Error finding container f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa: Status 404 returned error can't find the container with id f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.315288 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.445607 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") " Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.452381 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c" (OuterVolumeSpecName: "kube-api-access-7bc8c") pod "d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" (UID: "d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34"). InnerVolumeSpecName "kube-api-access-7bc8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.546998 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.883840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerStarted","Data":"c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a"} Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.883913 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerStarted","Data":"f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa"} Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.884122 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.887357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerStarted","Data":"5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a"} Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.890027 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerStarted","Data":"ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a"} Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.891442 4755 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.892355 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.892352 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerDied","Data":"87e583e770b84390d24444ba39d071b7a79cf80b1ef8556c747221568f1b50de"} Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.892394 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87e583e770b84390d24444ba39d071b7a79cf80b1ef8556c747221568f1b50de" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.922355 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" podStartSLOduration=11.922330509 podStartE2EDuration="11.922330509s" podCreationTimestamp="2026-03-20 13:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:13.90529192 +0000 UTC m=+293.503224449" watchObservedRunningTime="2026-03-20 13:35:13.922330509 +0000 UTC m=+293.520263038" Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.924300 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" podStartSLOduration=138.055234373 podStartE2EDuration="3m13.92429387s" podCreationTimestamp="2026-03-20 13:32:00 +0000 UTC" firstStartedPulling="2026-03-20 13:34:17.441558616 +0000 UTC m=+237.039491145" lastFinishedPulling="2026-03-20 13:35:13.310618113 +0000 UTC m=+292.908550642" observedRunningTime="2026-03-20 13:35:13.922490044 +0000 
UTC m=+293.520422563" watchObservedRunningTime="2026-03-20 13:35:13.92429387 +0000 UTC m=+293.522226389" Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.900921 4755 generic.go:334] "Generic (PLEG): container finished" podID="28deea0d-d80e-422b-a0c2-40670570aa68" containerID="5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a" exitCode=0 Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.901024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerDied","Data":"5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a"} Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.905127 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerID="ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a" exitCode=0 Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.905445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a"} Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.193887 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.294083 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") pod \"28deea0d-d80e-422b-a0c2-40670570aa68\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") " Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.301561 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr" (OuterVolumeSpecName: "kube-api-access-xqgfr") pod "28deea0d-d80e-422b-a0c2-40670570aa68" (UID: "28deea0d-d80e-422b-a0c2-40670570aa68"). InnerVolumeSpecName "kube-api-access-xqgfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.395937 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.918737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerDied","Data":"d1a591dd18b3c1bd59ffa816236e003ecd9f7f13017f4edb2ba58c108b15d7f4"} Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.919621 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a591dd18b3c1bd59ffa816236e003ecd9f7f13017f4edb2ba58c108b15d7f4" Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.918820 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.031028 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.032827 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager" containerID="cri-o://d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45" gracePeriod=30 Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.058276 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.058549 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" containerID="cri-o://c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a" gracePeriod=30 Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.719541 4755 patch_prober.go:28] interesting pod/route-controller-manager-ff8dbcb57-cxhxw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.719643 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.968442 4755 generic.go:334] "Generic (PLEG): container finished" podID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerID="c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a" exitCode=0 Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.968611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerDied","Data":"c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a"} Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.970753 4755 generic.go:334] "Generic (PLEG): container finished" podID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerID="d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45" exitCode=0 Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.970810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerDied","Data":"d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45"} Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.982709 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerStarted","Data":"c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09"} Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.009861 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lkkl5" podStartSLOduration=4.596072378 podStartE2EDuration="59.009834337s" podCreationTimestamp="2026-03-20 13:34:24 +0000 UTC" firstStartedPulling="2026-03-20 13:34:27.304070845 +0000 UTC 
m=+246.902003374" lastFinishedPulling="2026-03-20 13:35:21.717832804 +0000 UTC m=+301.315765333" observedRunningTime="2026-03-20 13:35:23.006929138 +0000 UTC m=+302.604861697" watchObservedRunningTime="2026-03-20 13:35:23.009834337 +0000 UTC m=+302.607766866" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.908649 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.919755 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.919956 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.919999 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.920111 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.920141 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.921962 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca" (OuterVolumeSpecName: "client-ca") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.922955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.923525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config" (OuterVolumeSpecName: "config") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.937401 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv" (OuterVolumeSpecName: "kube-api-access-7fsjv") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "kube-api-access-7fsjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.952873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.962523 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"] Mar 20 13:35:23 crc kubenswrapper[4755]: E0320 13:35:23.962839 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.962856 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager" Mar 20 13:35:23 crc kubenswrapper[4755]: E0320 13:35:23.962869 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" containerName="oc" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.962877 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" containerName="oc" Mar 20 13:35:23 crc kubenswrapper[4755]: E0320 13:35:23.963644 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" containerName="oc" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963681 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" containerName="oc" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963807 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" 
containerName="oc" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963827 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963841 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" containerName="oc" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.964263 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"] Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.964369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.999235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerDied","Data":"9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3"} Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.999307 4755 scope.go:117] "RemoveContainer" containerID="d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45" Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.999529 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.019329 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021505 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021538 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021560 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021582 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021623 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021687 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021701 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021713 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021724 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021733 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.072258 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.077615 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.122297 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.122600 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.122774 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca" (OuterVolumeSpecName: "client-ca") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124420 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config" (OuterVolumeSpecName: "config") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124578 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124644 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124754 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod 
\"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124871 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.125013 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.125039 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.126347 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.126612 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.127585 4755 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.130379 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc" (OuterVolumeSpecName: "kube-api-access-2vckc") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "kube-api-access-2vckc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.131834 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.133117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.147581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.225900 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.225944 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.315296 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.749941 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"] Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.010483 4755 generic.go:334] "Generic (PLEG): container finished" podID="184aa529-45c4-42c9-8eee-04bd18fba718" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" exitCode=0 Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.010608 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.015352 4755 generic.go:334] "Generic (PLEG): container finished" podID="2db67acd-25db-47a7-80ea-da4065a60e23" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" exitCode=0 Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.015379 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.018001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerStarted","Data":"dc837d19aecb0eeba01a794632cec0b82547d105d101892ba220de3315f6f008"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.021534 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" 
event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerDied","Data":"f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.021583 4755 scope.go:117] "RemoveContainer" containerID="c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.021740 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.031271 4755 generic.go:334] "Generic (PLEG): container finished" podID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" exitCode=0 Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.031341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.124672 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.127532 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.234387 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" path="/var/lib/kubelet/pods/2dedcffe-6047-46bb-9970-eed6e7dfcd2a/volumes" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.235450 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" 
path="/var/lib/kubelet/pods/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc/volumes" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.331292 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.331539 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.042673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerStarted","Data":"dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c"} Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.045005 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.049582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.068010 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" podStartSLOduration=4.067987228 podStartE2EDuration="4.067987228s" podCreationTimestamp="2026-03-20 13:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:26.063631326 +0000 UTC m=+305.661563895" watchObservedRunningTime="2026-03-20 13:35:26.067987228 +0000 UTC m=+305.665919797" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.411020 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 
13:35:26 crc kubenswrapper[4755]: E0320 13:35:26.411319 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.411337 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.411471 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.412001 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416279 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416407 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416443 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416536 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.417449 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.418084 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:35:26 
crc kubenswrapper[4755]: I0320 13:35:26.433894 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463252 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 
20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.565061 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod 
\"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.565309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.571717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.581026 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.655043 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server" probeResult="failure" output=< Mar 20 13:35:26 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:35:26 crc kubenswrapper[4755]: > Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.781026 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:27 crc kubenswrapper[4755]: I0320 13:35:27.051780 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerStarted","Data":"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85"} Mar 20 13:35:28 crc kubenswrapper[4755]: I0320 13:35:28.060407 4755 generic.go:334] "Generic (PLEG): container finished" podID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" exitCode=0 Mar 20 13:35:28 crc kubenswrapper[4755]: I0320 13:35:28.060545 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85"} Mar 20 13:35:28 crc kubenswrapper[4755]: I0320 13:35:28.858381 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 13:35:29 crc kubenswrapper[4755]: I0320 13:35:29.070726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerStarted","Data":"3132c3da4cf259af8b45987aa5d639d9329cf6c2181feff72111c4f02f43d042"} Mar 20 13:35:33 crc kubenswrapper[4755]: I0320 13:35:33.108040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerStarted","Data":"b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9"} Mar 20 13:35:33 crc kubenswrapper[4755]: I0320 13:35:33.112835 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerStarted","Data":"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7"} Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.119307 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.125518 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.145736 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" podStartSLOduration=12.145711821 podStartE2EDuration="12.145711821s" podCreationTimestamp="2026-03-20 13:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:34.145238987 +0000 UTC m=+313.743171546" watchObservedRunningTime="2026-03-20 13:35:34.145711821 +0000 UTC m=+313.743644390" Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.177139 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shzbw" podStartSLOduration=8.446711804 podStartE2EDuration="1m13.177103517s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.719792453 +0000 UTC m=+243.317724982" lastFinishedPulling="2026-03-20 13:35:28.450184156 +0000 UTC m=+308.048116695" observedRunningTime="2026-03-20 13:35:34.17127894 +0000 UTC m=+313.769211489" watchObservedRunningTime="2026-03-20 13:35:34.177103517 +0000 UTC m=+313.775036056" Mar 20 13:35:35 crc kubenswrapper[4755]: I0320 13:35:35.441162 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:35 crc kubenswrapper[4755]: I0320 13:35:35.514543 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:35 crc kubenswrapper[4755]: I0320 13:35:35.690388 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.136252 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerStarted","Data":"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247"} Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.138099 4755 generic.go:334] "Generic (PLEG): container finished" podID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" exitCode=0 Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.138955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19"} Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.751805 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.751904 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.751966 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.752626 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.752715 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280" gracePeriod=600 Mar 20 13:35:37 crc kubenswrapper[4755]: I0320 13:35:37.153876 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server" containerID="cri-o://c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09" gracePeriod=2 Mar 20 13:35:37 crc kubenswrapper[4755]: I0320 13:35:37.215027 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-929x7" podStartSLOduration=8.438647583 podStartE2EDuration="1m14.215005203s" podCreationTimestamp="2026-03-20 13:34:23 +0000 UTC" firstStartedPulling="2026-03-20 13:34:24.916687919 +0000 UTC m=+244.514620448" lastFinishedPulling="2026-03-20 13:35:30.693045489 +0000 UTC m=+310.290978068" 
observedRunningTime="2026-03-20 13:35:37.211254488 +0000 UTC m=+316.809187027" watchObservedRunningTime="2026-03-20 13:35:37.215005203 +0000 UTC m=+316.812937742" Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.162819 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280" exitCode=0 Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.162934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280"} Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.166261 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerID="c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09" exitCode=0 Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.166290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09"} Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.176127 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerStarted","Data":"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a"} Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.179562 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerStarted","Data":"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee"} Mar 20 13:35:39 crc 
kubenswrapper[4755]: I0320 13:35:39.626013 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.786899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"e9ec78bf-3afe-49d9-983a-99645840cecb\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.787133 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"e9ec78bf-3afe-49d9-983a-99645840cecb\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.787208 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"e9ec78bf-3afe-49d9-983a-99645840cecb\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.788064 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities" (OuterVolumeSpecName: "utilities") pod "e9ec78bf-3afe-49d9-983a-99645840cecb" (UID: "e9ec78bf-3afe-49d9-983a-99645840cecb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.794766 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm" (OuterVolumeSpecName: "kube-api-access-hgvrm") pod "e9ec78bf-3afe-49d9-983a-99645840cecb" (UID: "e9ec78bf-3afe-49d9-983a-99645840cecb"). InnerVolumeSpecName "kube-api-access-hgvrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.888710 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.888748 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.937232 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9ec78bf-3afe-49d9-983a-99645840cecb" (UID: "e9ec78bf-3afe-49d9-983a-99645840cecb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.990081 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.197249 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.197249 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"e8b20b8055283611079980efbe691798926e8e8d967c03c6c2d40a174aa03339"} Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.197911 4755 scope.go:117] "RemoveContainer" containerID="c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09" Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.205584 4755 generic.go:334] "Generic (PLEG): container finished" podID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" exitCode=0 Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.205702 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee"} Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.232681 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vm24m" podStartSLOduration=5.21447221 podStartE2EDuration="1m19.23262184s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.779001758 +0000 UTC m=+243.376934287" lastFinishedPulling="2026-03-20 13:35:37.797151338 +0000 UTC m=+317.395083917" observedRunningTime="2026-03-20 13:35:40.227920286 +0000 UTC m=+319.825852855" watchObservedRunningTime="2026-03-20 13:35:40.23262184 +0000 UTC m=+319.830554399" Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.274623 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 
13:35:40.279200 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.233000 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" path="/var/lib/kubelet/pods/e9ec78bf-3afe-49d9-983a-99645840cecb/volumes" Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.303957 4755 scope.go:117] "RemoveContainer" containerID="ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a" Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.504826 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.504902 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.568595 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.672300 4755 scope.go:117] "RemoveContainer" containerID="3af5416fdb7aa4f016f347ab29ea0d465dc03414f176f3ce6611a45e7044555c" Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.040543 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"] Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.040869 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager" containerID="cri-o://dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c" gracePeriod=30 Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.132138 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.132380 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager" containerID="cri-o://b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9" gracePeriod=30 Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.150563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.150756 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.206150 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.268970 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.242914 4755 generic.go:334] "Generic (PLEG): container finished" podID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerID="b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9" exitCode=0 Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.243049 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerDied","Data":"b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9"} Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.247909 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="400323f5-babd-4943-a66e-515ee2b59889" containerID="dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c" exitCode=0 Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.248137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerDied","Data":"dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c"} Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.488617 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.488718 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.551099 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.799910 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.844075 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.857572 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"] Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858103 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858124 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager" Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858155 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858166 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server" Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858186 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858194 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager" Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858207 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-utilities" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858238 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-utilities" Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858257 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-content" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858264 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-content" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858405 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858421 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858433 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.863497 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"] Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.863610 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876638 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876894 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877144 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877313 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877364 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877678 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-config\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.878196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca" (OuterVolumeSpecName: "client-ca") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: 
"400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqsl\" (UniqueName: \"kubernetes.io/projected/f2857fcc-84c2-42ab-81d7-5e430db9cfba-kube-api-access-mfqsl\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.879188 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-client-ca\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.879816 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2857fcc-84c2-42ab-81d7-5e430db9cfba-serving-cert\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.880124 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.880795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles" (OuterVolumeSpecName: 
"proxy-ca-bundles") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.881909 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config" (OuterVolumeSpecName: "config") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.886467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.886865 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config" (OuterVolumeSpecName: "config") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.895729 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.895860 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz" (OuterVolumeSpecName: "kube-api-access-8d5qz") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "kube-api-access-8d5qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.899843 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9" (OuterVolumeSpecName: "kube-api-access-d68f9") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "kube-api-access-d68f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.901710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.981849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqsl\" (UniqueName: \"kubernetes.io/projected/f2857fcc-84c2-42ab-81d7-5e430db9cfba-kube-api-access-mfqsl\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.982674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-client-ca\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.982797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2857fcc-84c2-42ab-81d7-5e430db9cfba-serving-cert\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.982975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-config\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983087 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983165 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983239 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983307 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983372 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983459 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983534 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983962 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:43 crc 
kubenswrapper[4755]: I0320 13:35:43.987217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-client-ca\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.987914 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-config\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.994603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2857fcc-84c2-42ab-81d7-5e430db9cfba-serving-cert\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.001002 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqsl\" (UniqueName: \"kubernetes.io/projected/f2857fcc-84c2-42ab-81d7-5e430db9cfba-kube-api-access-mfqsl\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.192481 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.267376 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerStarted","Data":"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e"} Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.279921 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab"} Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.295483 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d8rq7" podStartSLOduration=3.429581004 podStartE2EDuration="1m23.295458996s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.799376768 +0000 UTC m=+243.397309297" lastFinishedPulling="2026-03-20 13:35:43.66525476 +0000 UTC m=+323.263187289" observedRunningTime="2026-03-20 13:35:44.293631734 +0000 UTC m=+323.891564293" watchObservedRunningTime="2026-03-20 13:35:44.295458996 +0000 UTC m=+323.893391525" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.302238 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerStarted","Data":"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520"} Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.338494 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce4d5763-1786-4b87-8497-0c65da46f446" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" exitCode=0 Mar 20 13:35:44 crc 
kubenswrapper[4755]: I0320 13:35:44.338607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98"} Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.341083 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lkvql" podStartSLOduration=6.3537257910000005 podStartE2EDuration="1m20.341059837s" podCreationTimestamp="2026-03-20 13:34:24 +0000 UTC" firstStartedPulling="2026-03-20 13:34:27.316750481 +0000 UTC m=+246.914683010" lastFinishedPulling="2026-03-20 13:35:41.304084487 +0000 UTC m=+320.902017056" observedRunningTime="2026-03-20 13:35:44.340255114 +0000 UTC m=+323.938187653" watchObservedRunningTime="2026-03-20 13:35:44.341059837 +0000 UTC m=+323.938992356" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.349540 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerStarted","Data":"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476"} Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.351144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerDied","Data":"3132c3da4cf259af8b45987aa5d639d9329cf6c2181feff72111c4f02f43d042"} Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.351179 4755 scope.go:117] "RemoveContainer" containerID="b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.351317 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.362546 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.363002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerDied","Data":"dc837d19aecb0eeba01a794632cec0b82547d105d101892ba220de3315f6f008"} Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.406292 4755 scope.go:117] "RemoveContainer" containerID="dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.417791 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.422335 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.425006 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.430457 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"] Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.454024 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"] Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.687957 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"] Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.887750 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.887842 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.232375 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400323f5-babd-4943-a66e-515ee2b59889" path="/var/lib/kubelet/pods/400323f5-babd-4943-a66e-515ee2b59889/volumes" Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.233569 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" path="/var/lib/kubelet/pods/af80897c-3c59-4376-b0f0-15d862d4b7d5/volumes" Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.234303 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"] Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.371880 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" event={"ID":"f2857fcc-84c2-42ab-81d7-5e430db9cfba","Type":"ContainerStarted","Data":"05c817fcb54f8b0544d8f4535a30e90ac957e51d5fddffedb04bb96117c07a28"} Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.371931 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" event={"ID":"f2857fcc-84c2-42ab-81d7-5e430db9cfba","Type":"ContainerStarted","Data":"3187e96982f3615bd579e5fcf8eeb7d0176b994fffb615b11e16a52d9cef80dd"} Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.397774 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-cgznb" podStartSLOduration=4.28759628 podStartE2EDuration="1m24.397744067s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.682288531 +0000 UTC m=+243.280221060" lastFinishedPulling="2026-03-20 13:35:43.792436318 +0000 UTC m=+323.390368847" observedRunningTime="2026-03-20 13:35:45.395082713 +0000 UTC m=+324.993015262" watchObservedRunningTime="2026-03-20 13:35:45.397744067 +0000 UTC m=+324.995676596" Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.419668 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" podStartSLOduration=3.419623627 podStartE2EDuration="3.419623627s" podCreationTimestamp="2026-03-20 13:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:45.417045756 +0000 UTC m=+325.014978305" watchObservedRunningTime="2026-03-20 13:35:45.419623627 +0000 UTC m=+325.017556166" Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.951133 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" probeResult="failure" output=< Mar 20 13:35:45 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:35:45 crc kubenswrapper[4755]: > Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.382967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerStarted","Data":"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2"} Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.384877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.390061 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.405251 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nlslg" podStartSLOduration=2.848870367 podStartE2EDuration="1m23.405221457s" podCreationTimestamp="2026-03-20 13:34:23 +0000 UTC" firstStartedPulling="2026-03-20 13:34:24.867216712 +0000 UTC m=+244.465149241" lastFinishedPulling="2026-03-20 13:35:45.423567802 +0000 UTC m=+325.021500331" observedRunningTime="2026-03-20 13:35:46.403178462 +0000 UTC m=+326.001110991" watchObservedRunningTime="2026-03-20 13:35:46.405221457 +0000 UTC m=+326.003153996" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.424792 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59d967769f-g8465"] Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.425599 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.428206 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.428844 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.428866 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.429746 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.430486 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.430899 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.443083 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.449994 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59d967769f-g8465"] Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.529589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-config\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " 
pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.529678 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-825bm\" (UniqueName: \"kubernetes.io/projected/c9017aa0-1c82-4753-b448-b07556e89259-kube-api-access-825bm\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.529749 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-client-ca\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.530811 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9017aa0-1c82-4753-b448-b07556e89259-serving-cert\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.530918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-proxy-ca-bundles\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.632384 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-config\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.632809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-825bm\" (UniqueName: \"kubernetes.io/projected/c9017aa0-1c82-4753-b448-b07556e89259-kube-api-access-825bm\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.632934 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-client-ca\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.633064 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9017aa0-1c82-4753-b448-b07556e89259-serving-cert\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.633166 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-proxy-ca-bundles\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 
13:35:46.634356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-client-ca\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.634519 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-config\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.635212 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-proxy-ca-bundles\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.645217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9017aa0-1c82-4753-b448-b07556e89259-serving-cert\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.657949 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-825bm\" (UniqueName: \"kubernetes.io/projected/c9017aa0-1c82-4753-b448-b07556e89259-kube-api-access-825bm\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " 
pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.742545 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.073055 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59d967769f-g8465"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.255712 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.256752 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.256876 4755 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257541 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257696 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257742 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257690 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.258129 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.258315 4755 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.292870 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.311804 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312083 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312105 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312118 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312125 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312138 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312145 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312156 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312162 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312169 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312175 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312275 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312282 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312290 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312296 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312307 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312313 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312417 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312427 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312434 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312446 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312453 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312459 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312466 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312474 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312570 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312579 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312697 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312824 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312832 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342005 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342110 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342227 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342325 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342358 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342420 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.349699 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59d967769f-g8465.189e901aa55eec01 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59d967769f-g8465,UID:c9017aa0-1c82-4753-b448-b07556e89259,APIVersion:v1,ResourceVersion:29964,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,LastTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC 
m=+326.946497522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.395041 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396193 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396812 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" exitCode=0 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396834 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" exitCode=2 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396902 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.399553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" event={"ID":"c9017aa0-1c82-4753-b448-b07556e89259","Type":"ContainerStarted","Data":"08f942cf97d3c7a747d20319a863fdb751e92d12dadc8b794d2edd09a8ddba05"} Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.399587 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" 
event={"ID":"c9017aa0-1c82-4753-b448-b07556e89259","Type":"ContainerStarted","Data":"e83f0d09983e19d308152d03cd48e7dcf807481871cc70e9fa4f974545dbacc3"} Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443253 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443382 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443384 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443434 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443493 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443534 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443572 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443873 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443959 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443984 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.444025 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443963 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.589477 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: W0320 13:35:47.609949 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050 WatchSource:0}: Error finding container 6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050: Status 404 returned error can't find the container with id 6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.411148 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.412311 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.412364 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.415456 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab"} Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.415525 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050"} Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.417503 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.417984 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.420184 4755 generic.go:334] "Generic (PLEG): container finished" podID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerID="fffe9bdab6834124d2b86719fd421bb0588e3380d4e9338e0e67109cffba702d" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.420859 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerDied","Data":"fffe9bdab6834124d2b86719fd421bb0588e3380d4e9338e0e67109cffba702d"} Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.421409 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.422068 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.423098 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.423621 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.424227 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.424621 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.424937 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.425204 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.428157 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.428454 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.428762 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial 
tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.429032 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.429307 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.747111 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.747832 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: E0320 13:35:49.374591 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59d967769f-g8465.189e901aa55eec01 
openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59d967769f-g8465,UID:c9017aa0-1c82-4753-b448-b07556e89259,APIVersion:v1,ResourceVersion:29964,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,LastTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.751140 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.752438 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.752992 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.753385 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.883872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884011 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884119 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock" (OuterVolumeSpecName: "var-lock") pod "c0668fdb-be01-431d-9cbb-dabae6eb44e1" (UID: "c0668fdb-be01-431d-9cbb-dabae6eb44e1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884255 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c0668fdb-be01-431d-9cbb-dabae6eb44e1" (UID: "c0668fdb-be01-431d-9cbb-dabae6eb44e1"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884624 4755 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884641 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.892922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c0668fdb-be01-431d-9cbb-dabae6eb44e1" (UID: "c0668fdb-be01-431d-9cbb-dabae6eb44e1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.986250 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.438142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerDied","Data":"ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e"} Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.438194 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.438198 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.454431 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.454892 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.455234 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.863556 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.865184 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.865818 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.866281 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.866750 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.867300 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897484 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897576 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897662 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897715 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897720 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.898055 4755 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.898079 4755 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.898098 4755 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.229605 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.230234 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.230725 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": 
dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.231364 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.236639 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.452637 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.454850 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" exitCode=0 Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.454954 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.455001 4755 scope.go:117] "RemoveContainer" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.456128 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.457029 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.457804 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.459037 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.460322 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.460855 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.461097 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.461319 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.483039 4755 scope.go:117] "RemoveContainer" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.508150 4755 scope.go:117] "RemoveContainer" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.529025 4755 scope.go:117] "RemoveContainer" 
containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.554737 4755 scope.go:117] "RemoveContainer" containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.614053 4755 scope.go:117] "RemoveContainer" containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.638190 4755 scope.go:117] "RemoveContainer" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.638925 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\": container with ID starting with 40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4 not found: ID does not exist" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639020 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4"} err="failed to get container status \"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\": rpc error: code = NotFound desc = could not find container \"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\": container with ID starting with 40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4 not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639070 4755 scope.go:117] "RemoveContainer" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.639512 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\": container with ID starting with ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c not found: ID does not exist" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639557 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c"} err="failed to get container status \"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\": rpc error: code = NotFound desc = could not find container \"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\": container with ID starting with ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639591 4755 scope.go:117] "RemoveContainer" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.639975 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\": container with ID starting with 91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462 not found: ID does not exist" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640021 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462"} err="failed to get container status \"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\": rpc error: code = NotFound desc = could not find container 
\"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\": container with ID starting with 91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462 not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640053 4755 scope.go:117] "RemoveContainer" containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.640632 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\": container with ID starting with 280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78 not found: ID does not exist" containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640706 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78"} err="failed to get container status \"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\": rpc error: code = NotFound desc = could not find container \"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\": container with ID starting with 280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78 not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640731 4755 scope.go:117] "RemoveContainer" containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.641077 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\": container with ID starting with 26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c not found: ID does not exist" 
containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.641140 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c"} err="failed to get container status \"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\": rpc error: code = NotFound desc = could not find container \"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\": container with ID starting with 26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.641180 4755 scope.go:117] "RemoveContainer" containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.641603 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\": container with ID starting with 2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e not found: ID does not exist" containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.641627 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e"} err="failed to get container status \"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\": rpc error: code = NotFound desc = could not find container \"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\": container with ID starting with 2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.798102 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.798168 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.869993 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.871125 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.871737 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.872345 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.873087 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.873687 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.937629 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.937725 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.051574 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.052549 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.053693 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.054366 4755 
status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.054996 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.055519 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.056038 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.219950 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.221350 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.222182 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.222755 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.223428 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.224289 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.224810 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.225400 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.538540 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.539531 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.540364 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.541169 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 
13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.541637 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.542049 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.542190 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.542550 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.543093 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.543586 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" 
pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.543994 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.544626 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.545351 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.545867 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.546421 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.546953 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.715864 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.716425 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.717179 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.717561 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.717909 4755 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: I0320 13:35:53.717953 4755 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.718268 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="200ms" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.919375 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="400ms" Mar 20 13:35:53 crc kubenswrapper[4755]: I0320 13:35:53.955246 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:53 crc kubenswrapper[4755]: I0320 13:35:53.955337 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.016689 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.017790 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" 
Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.018286 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.018578 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.018889 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.019193 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.019424 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 
38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.019717 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: E0320 13:35:54.337220 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="800ms" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.571539 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.572591 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.573192 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.573701 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.574175 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.574623 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.575190 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.575771 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.967008 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:35:54 crc 
kubenswrapper[4755]: I0320 13:35:54.968406 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.969573 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.970395 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.971008 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.971605 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: 
connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.972165 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.972776 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.973337 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.054343 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.055084 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.055718 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" 
pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.056345 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.056633 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057004 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057371 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057565 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057772 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: E0320 13:35:55.139397 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="1.6s" Mar 20 13:35:56 crc kubenswrapper[4755]: E0320 13:35:56.741352 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="3.2s" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.254191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.254642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:35:57 crc kubenswrapper[4755]: W0320 13:35:57.255141 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:57 crc kubenswrapper[4755]: E0320 13:35:57.255349 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:57 crc kubenswrapper[4755]: W0320 13:35:57.255972 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:57 crc kubenswrapper[4755]: E0320 13:35:57.256082 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316\": dial tcp 
38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.356063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.356535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:35:57 crc kubenswrapper[4755]: W0320 13:35:57.357339 4755 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:57 crc kubenswrapper[4755]: E0320 13:35:57.357498 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.255596 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync 
configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.255632 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.256197 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:00.256155547 +0000 UTC m=+459.854088116 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.256261 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:00.256235629 +0000 UTC m=+459.854168188 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.356912 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.356952 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: W0320 13:35:58.357751 4755 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.357864 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:58 crc kubenswrapper[4755]: W0320 13:35:58.945827 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.945922 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.225075 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.225760 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.226113 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.226434 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.226838 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.227250 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.227772 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.228121 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.228591 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.247601 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.247649 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.248162 4755 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.248698 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:35:59 crc kubenswrapper[4755]: W0320 13:35:59.272815 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df WatchSource:0}: Error finding container f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df: Status 404 returned error can't find the container with id f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358129 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358163 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358190 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358201 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358303 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:01.358270475 +0000 UTC m=+460.956203014 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358336 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:01.358323797 +0000 UTC m=+460.956256406 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.375756 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59d967769f-g8465.189e901aa55eec01 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59d967769f-g8465,UID:c9017aa0-1c82-4753-b448-b07556e89259,APIVersion:v1,ResourceVersion:29964,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,LastTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.521181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df"}
Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.942828 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="6.4s"
Mar 20 13:36:00 crc kubenswrapper[4755]: W0320 13:36:00.096754 4755 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:36:00 crc kubenswrapper[4755]: E0320 13:36:00.096873 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:36:00 crc kubenswrapper[4755]: W0320 13:36:00.118384 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:36:00 crc kubenswrapper[4755]: E0320 13:36:00.118574 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.532798 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.535303 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.535384 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea" exitCode=1
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.535468 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea"}
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.536304 4755 scope.go:117] "RemoveContainer" containerID="285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.536862 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.537554 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.538549 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539071 4755 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="65b16727d1e2d3ada1de65c7caf024709ec513abe901df42225ca09d63835f49" exitCode=0
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539118 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"65b16727d1e2d3ada1de65c7caf024709ec513abe901df42225ca09d63835f49"}
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539229 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539420 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539447 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539882 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: E0320 13:36:00.540112 4755 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.540448 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.540839 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.541275 4755 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.541701 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.542208 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.542554 4755 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.542913 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.543429 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.544100 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.544614 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.545252 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.545728 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.546193 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.549932 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.552199 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.552303 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"11934a74662a01bf0f322f88ccd25f18ae746365df5aba2c83fb9bf72d79a6a6"}
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.556222 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d7d6ec40aa203d17a1aab627d7d3551eccdc43a15d2e7018b643b749da4273b1"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.225951 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.231173 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6901cc8a30e124ad290689add14419d01602dcab403eb96bd0e010a281f78c19"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9c291fcf4504b3c8fd028f59d69391ac5e780fe39587e60eef518bb784028716"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7a9fbeddb52e58cf66cc8cd7609fa52e9c30f8ad74e2ee62ac749cf8edfe5eb4"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564610 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ffb3d5cc1b398ae6bc0b008cf6714afe6e3b0c0db380db488c104a7988c8d40a"}
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579188 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579234 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579390 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:04 crc kubenswrapper[4755]: I0320 13:36:04.248920 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:04 crc kubenswrapper[4755]: I0320 13:36:04.249270 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:04 crc kubenswrapper[4755]: I0320 13:36:04.263560 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:05 crc kubenswrapper[4755]: I0320 13:36:05.052100 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 13:36:05 crc kubenswrapper[4755]: I0320 13:36:05.952812 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 13:36:07 crc kubenswrapper[4755]: I0320 13:36:07.698534 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.590370 4755 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.615118 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.615174 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.620602 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.623560 4755 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9f8b41c-3dfa-4aed-92f0-fe3c7dedcba8"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.671544 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 13:36:09 crc kubenswrapper[4755]: I0320 13:36:09.622211 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:09 crc kubenswrapper[4755]: I0320 13:36:09.622256 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.268249 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" containerID="cri-o://275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d" gracePeriod=15
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.634074 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerID="275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d" exitCode=0
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.634271 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerDied","Data":"275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d"}
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.758809 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772674 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772712 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772738 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772763 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772809 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772829 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772898 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772918 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772938 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772958 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.773003 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.773492 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.774138 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.774831 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.774960 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.775583 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.781131 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.781877 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7" (OuterVolumeSpecName: "kube-api-access-vr2x7") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "kube-api-access-vr2x7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.784187 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.784370 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.785175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.785450 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.785532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.788121 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.790918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874049 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874085 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874095 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874105
4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874115 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874126 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874135 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874145 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874153 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874161 4755 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") on 
node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874172 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874182 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874190 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874199 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.255744 4755 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9f8b41c-3dfa-4aed-92f0-fe3c7dedcba8" Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.643123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerDied","Data":"9edc35520733cdbb8ffbbdcc2f02ec6ef4e5e7ada3cc88f2fa7d388e53bb80dd"} Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.643202 4755 scope.go:117] "RemoveContainer" containerID="275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d" 
Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.643222 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:36:15 crc kubenswrapper[4755]: E0320 13:36:15.251823 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:36:16 crc kubenswrapper[4755]: E0320 13:36:16.505137 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:36:16 crc kubenswrapper[4755]: E0320 13:36:16.532051 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:36:17 crc kubenswrapper[4755]: I0320 13:36:17.484324 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.616997 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.795444 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.809430 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.954091 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.139170 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.217179 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.246035 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.468541 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.640096 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.077973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.121848 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.145064 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.297643 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.347165 4755 reflector.go:368] Caches 
populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.357284 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.501505 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.782306 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.785043 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.790365 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.833727 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.117098 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.281014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.321191 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.364123 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:36:21 crc kubenswrapper[4755]: 
I0320 13:36:21.372908 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.390908 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.413756 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.514403 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.541136 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.568808 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.886364 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.119587 4755 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.132003 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.251560 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.281089 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.327717 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.359038 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.431599 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.435963 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.699798 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.819205 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.915928 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.950209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.034257 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.090327 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.121957 4755 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.150450 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.187857 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.236625 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.247342 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.271045 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.305955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.306552 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.544509 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.639519 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.662346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:36:23 
crc kubenswrapper[4755]: I0320 13:36:23.902596 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.957638 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.958730 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.058268 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.076185 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.099973 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.134142 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.146007 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.151417 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.183524 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.239498 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.303530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.385704 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.469729 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.503355 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.503817 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.507202 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.508803 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.859627 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.870672 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.927632 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.931398 4755 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.982724 4755 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.008281 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.073815 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.129204 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.197673 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.227982 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.257393 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.360928 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.382238 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.415979 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 
13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.464915 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.525029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.550162 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.565528 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.623218 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.686948 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.688397 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.793484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.845396 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.956377 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 
13:36:26.043316 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.425942 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.560169 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.838114 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.874526 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.882946 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.905236 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.983711 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.071382 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.209544 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.276285 4755 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.280448 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.446176 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.522548 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.529858 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.541505 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.552318 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.692782 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.752574 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.804151 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.810853 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.831155 4755 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.839350 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.886006 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.938740 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.962769 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.982450 4755 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.984007 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.985342 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" podStartSLOduration=45.985316454 podStartE2EDuration="45.985316454s" podCreationTimestamp="2026-03-20 13:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:07.83325629 +0000 UTC m=+347.431188829" watchObservedRunningTime="2026-03-20 13:36:27.985316454 +0000 UTC m=+367.583249013" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.987991 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
podStartSLOduration=40.987971598 podStartE2EDuration="40.987971598s" podCreationTimestamp="2026-03-20 13:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:07.849053539 +0000 UTC m=+347.446986098" watchObservedRunningTime="2026-03-20 13:36:27.987971598 +0000 UTC m=+367.585904167" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.993244 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.993341 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.997975 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.001018 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.021686 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.021634891 podStartE2EDuration="20.021634891s" podCreationTimestamp="2026-03-20 13:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:28.019951577 +0000 UTC m=+367.617884136" watchObservedRunningTime="2026-03-20 13:36:28.021634891 +0000 UTC m=+367.619567460" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.056524 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.059123 4755 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.095969 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.149482 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.186952 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.224920 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.225359 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.228815 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.413220 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.476077 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.496289 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.571183 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.582733 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.646361 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.727310 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:36:28 crc kubenswrapper[4755]: E0320 13:36:28.728030 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerName="installer" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728080 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerName="installer" Mar 20 13:36:28 crc kubenswrapper[4755]: E0320 13:36:28.728108 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728126 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728472 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerName="installer" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728521 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.730437 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.732332 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66f8689f66-k824k"] Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733276 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733618 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733734 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733867 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.737420 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.738350 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.738865 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.739090 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.739118 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:36:28 crc kubenswrapper[4755]: 
I0320 13:36:28.739325 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743267 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743679 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743717 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743907 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743984 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.744740 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.793229 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.794077 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.796443 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.799099 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841765 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"auto-csr-approver-29566896-bp947\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841815 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841856 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-policies\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842070 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzr4\" (UniqueName: \"kubernetes.io/projected/40e1333e-6ba5-4ac5-969b-06d408650a35-kube-api-access-pdzr4\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842107 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842141 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-dir\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " 
pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.910576 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.921816 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.921881 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 
crc kubenswrapper[4755]: I0320 13:36:28.943237 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943296 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"auto-csr-approver-29566896-bp947\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943361 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943391 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-policies\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943416 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943444 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzr4\" (UniqueName: \"kubernetes.io/projected/40e1333e-6ba5-4ac5-969b-06d408650a35-kube-api-access-pdzr4\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 
13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943493 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943550 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-dir\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943608 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943685 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.945485 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.945856 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.945917 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-policies\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.946032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-dir\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.947455 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.955771 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.955841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.955845 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.962538 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.962559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.962587 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.963150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.963314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.966877 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"auto-csr-approver-29566896-bp947\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.975159 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzr4\" (UniqueName: \"kubernetes.io/projected/40e1333e-6ba5-4ac5-969b-06d408650a35-kube-api-access-pdzr4\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.997604 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.077267 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.082679 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.085022 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.094227 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.116857 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.152178 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.173497 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.176251 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.235502 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" path="/var/lib/kubelet/pods/1ef1c7ef-1429-4467-abb5-837ad56896fb/volumes" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.274687 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.285612 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.286614 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.355483 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.368169 4755 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.534171 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.687514 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.705925 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.717585 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.823822 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.899050 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.907922 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.928079 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.959961 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.991092 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.141021 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.179938 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.225703 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.245692 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.282062 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.310793 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.385275 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.470598 4755 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.470995 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab" gracePeriod=5 Mar 20 13:36:30 crc 
kubenswrapper[4755]: I0320 13:36:30.476253 4755 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.481121 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.495587 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.504717 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.528013 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.536645 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.611326 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.710778 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.727851 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.737087 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f8689f66-k824k"] Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.784859 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.900268 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.945421 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.979725 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.080638 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.122476 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.166295 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.169565 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.210316 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.222581 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.231547 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:36:31 crc kubenswrapper[4755]: W0320 13:36:31.243393 4755 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e1333e_6ba5_4ac5_969b_06d408650a35.slice/crio-ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a WatchSource:0}: Error finding container ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a: Status 404 returned error can't find the container with id ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.255342 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f8689f66-k824k"] Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.410647 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.535029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.547520 4755 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.555512 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.634197 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.634194 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.792381 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.816669 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-bp947" event={"ID":"8532b92f-bed9-41b0-bf0d-99afa5703048","Type":"ContainerStarted","Data":"23522b550843dcadeda43ef47f2f27096cf312cd6c1b914cc16bf795845d690a"} Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.819152 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" event={"ID":"40e1333e-6ba5-4ac5-969b-06d408650a35","Type":"ContainerStarted","Data":"3d32abd23a34772ba75f29cee3392a59eea59802700a2a91776a6a69dcd6d646"} Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.819290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" event={"ID":"40e1333e-6ba5-4ac5-969b-06d408650a35","Type":"ContainerStarted","Data":"ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a"} Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.823224 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.836378 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.845532 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.856151 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" podStartSLOduration=46.856128034 podStartE2EDuration="46.856128034s" podCreationTimestamp="2026-03-20 13:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:31.852614522 +0000 UTC m=+371.450547091" 
watchObservedRunningTime="2026-03-20 13:36:31.856128034 +0000 UTC m=+371.454060573" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.946909 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.972545 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.016116 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.134535 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.244092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.299597 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.312782 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.587401 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.725210 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.830310 4755 generic.go:334] "Generic (PLEG): container finished" podID="8532b92f-bed9-41b0-bf0d-99afa5703048" 
containerID="48fedef7d2253c830a250936f751690b6a7ff3c3f6839674f960627f11642a63" exitCode=0 Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.831177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-bp947" event={"ID":"8532b92f-bed9-41b0-bf0d-99afa5703048","Type":"ContainerDied","Data":"48fedef7d2253c830a250936f751690b6a7ff3c3f6839674f960627f11642a63"} Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.855872 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.876035 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.886140 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.945706 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.952714 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.009187 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.024228 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.082573 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.087886 4755 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.244737 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.482554 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.550097 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.614647 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.677205 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.819692 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.829227 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.880439 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.883317 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.982531 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.058509 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.156693 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.182111 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.287071 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.316260 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.347508 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"8532b92f-bed9-41b0-bf0d-99afa5703048\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.355784 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz" (OuterVolumeSpecName: "kube-api-access-56gzz") pod "8532b92f-bed9-41b0-bf0d-99afa5703048" (UID: "8532b92f-bed9-41b0-bf0d-99afa5703048"). InnerVolumeSpecName "kube-api-access-56gzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.448939 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.614359 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.853445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-bp947" event={"ID":"8532b92f-bed9-41b0-bf0d-99afa5703048","Type":"ContainerDied","Data":"23522b550843dcadeda43ef47f2f27096cf312cd6c1b914cc16bf795845d690a"} Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.854042 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23522b550843dcadeda43ef47f2f27096cf312cd6c1b914cc16bf795845d690a" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.853610 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.016300 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.711246 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.864110 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.864197 4755 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab" exitCode=137 Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.946068 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.087209 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.087310 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.283990 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284184 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284261 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284262 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284348 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284409 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284435 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284606 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285000 4755 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285029 4755 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285047 4755 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285065 4755 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.296721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.386543 4755 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.875561 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.875757 4755 scope.go:117] "RemoveContainer" containerID="0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.875910 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.241792 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.242369 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.257858 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.257945 4755 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="52cd97fa-790d-4793-945c-f2ccf7fc8986" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.265331 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.265396 4755 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="52cd97fa-790d-4793-945c-f2ccf7fc8986" Mar 20 13:36:53 crc kubenswrapper[4755]: I0320 13:36:53.970908 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:36:53 crc kubenswrapper[4755]: I0320 13:36:53.972027 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d8rq7" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" containerID="cri-o://b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" gracePeriod=2 Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.169488 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.170249 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vm24m" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" containerID="cri-o://d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" gracePeriod=2 Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.469203 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.596339 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.596477 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.597409 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities" (OuterVolumeSpecName: "utilities") pod "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" (UID: "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.597471 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.597860 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.606748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd" (OuterVolumeSpecName: "kube-api-access-8rpkd") pod "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" (UID: "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0"). InnerVolumeSpecName "kube-api-access-8rpkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.662056 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" (UID: "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.680390 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.700496 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.700915 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.802540 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"184aa529-45c4-42c9-8eee-04bd18fba718\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.802628 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"184aa529-45c4-42c9-8eee-04bd18fba718\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.802726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"184aa529-45c4-42c9-8eee-04bd18fba718\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.803765 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities" (OuterVolumeSpecName: "utilities") pod "184aa529-45c4-42c9-8eee-04bd18fba718" (UID: 
"184aa529-45c4-42c9-8eee-04bd18fba718"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.806878 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8" (OuterVolumeSpecName: "kube-api-access-pfxf8") pod "184aa529-45c4-42c9-8eee-04bd18fba718" (UID: "184aa529-45c4-42c9-8eee-04bd18fba718"). InnerVolumeSpecName "kube-api-access-pfxf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.856969 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "184aa529-45c4-42c9-8eee-04bd18fba718" (UID: "184aa529-45c4-42c9-8eee-04bd18fba718"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.904946 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.905310 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.905435 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009229 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" exitCode=0 Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009330 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009336 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"b82219efa86cff3e92cd1609c0f3a02dacbb886afd0558266c139f378ee30512"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009412 4755 scope.go:117] "RemoveContainer" containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015148 4755 generic.go:334] "Generic (PLEG): container finished" podID="184aa529-45c4-42c9-8eee-04bd18fba718" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" exitCode=0 Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015263 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" 
event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015271 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.042378 4755 scope.go:117] "RemoveContainer" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.056834 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.069434 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.073365 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.076488 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.082554 4755 scope.go:117] "RemoveContainer" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.103381 4755 scope.go:117] "RemoveContainer" containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.103864 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e\": container with ID starting with b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e not found: ID does not exist" 
containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.103993 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e"} err="failed to get container status \"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e\": rpc error: code = NotFound desc = could not find container \"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e\": container with ID starting with b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.104098 4755 scope.go:117] "RemoveContainer" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.104505 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19\": container with ID starting with 3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19 not found: ID does not exist" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.104545 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19"} err="failed to get container status \"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19\": rpc error: code = NotFound desc = could not find container \"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19\": container with ID starting with 3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19 not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.104580 4755 scope.go:117] 
"RemoveContainer" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.105715 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d\": container with ID starting with 707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d not found: ID does not exist" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.105746 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d"} err="failed to get container status \"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d\": rpc error: code = NotFound desc = could not find container \"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d\": container with ID starting with 707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.105763 4755 scope.go:117] "RemoveContainer" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.125274 4755 scope.go:117] "RemoveContainer" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.146567 4755 scope.go:117] "RemoveContainer" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.164714 4755 scope.go:117] "RemoveContainer" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.166256 4755 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a\": container with ID starting with d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a not found: ID does not exist" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166306 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a"} err="failed to get container status \"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a\": rpc error: code = NotFound desc = could not find container \"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a\": container with ID starting with d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166341 4755 scope.go:117] "RemoveContainer" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.166625 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d\": container with ID starting with b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d not found: ID does not exist" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166654 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d"} err="failed to get container status \"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d\": rpc error: code = NotFound desc = could not find container 
\"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d\": container with ID starting with b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166696 4755 scope.go:117] "RemoveContainer" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.167209 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef\": container with ID starting with a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef not found: ID does not exist" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.167276 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef"} err="failed to get container status \"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef\": rpc error: code = NotFound desc = could not find container \"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef\": container with ID starting with a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.232590 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" path="/var/lib/kubelet/pods/184aa529-45c4-42c9-8eee-04bd18fba718/volumes" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.233551 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" path="/var/lib/kubelet/pods/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0/volumes" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.372553 
4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.373034 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nlslg" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" containerID="cri-o://a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" gracePeriod=2 Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.882928 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.941840 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"ce4d5763-1786-4b87-8497-0c65da46f446\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.941903 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"ce4d5763-1786-4b87-8497-0c65da46f446\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.942006 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"ce4d5763-1786-4b87-8497-0c65da46f446\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.943197 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities" (OuterVolumeSpecName: "utilities") pod 
"ce4d5763-1786-4b87-8497-0c65da46f446" (UID: "ce4d5763-1786-4b87-8497-0c65da46f446"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.949309 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx" (OuterVolumeSpecName: "kube-api-access-nblfx") pod "ce4d5763-1786-4b87-8497-0c65da46f446" (UID: "ce4d5763-1786-4b87-8497-0c65da46f446"). InnerVolumeSpecName "kube-api-access-nblfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.994259 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce4d5763-1786-4b87-8497-0c65da46f446" (UID: "ce4d5763-1786-4b87-8497-0c65da46f446"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034149 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce4d5763-1786-4b87-8497-0c65da46f446" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" exitCode=0 Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2"} Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034221 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034240 4755 scope.go:117] "RemoveContainer" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034228 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e"} Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.045485 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.045955 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.045969 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.057846 4755 scope.go:117] "RemoveContainer" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.069607 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.074748 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.088226 4755 scope.go:117] 
"RemoveContainer" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.101108 4755 scope.go:117] "RemoveContainer" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" Mar 20 13:36:57 crc kubenswrapper[4755]: E0320 13:36:57.101775 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2\": container with ID starting with a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2 not found: ID does not exist" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.101823 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2"} err="failed to get container status \"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2\": rpc error: code = NotFound desc = could not find container \"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2\": container with ID starting with a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2 not found: ID does not exist" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.101855 4755 scope.go:117] "RemoveContainer" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" Mar 20 13:36:57 crc kubenswrapper[4755]: E0320 13:36:57.102292 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98\": container with ID starting with b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98 not found: ID does not exist" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" Mar 20 13:36:57 crc 
kubenswrapper[4755]: I0320 13:36:57.102366 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98"} err="failed to get container status \"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98\": rpc error: code = NotFound desc = could not find container \"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98\": container with ID starting with b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98 not found: ID does not exist" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.102408 4755 scope.go:117] "RemoveContainer" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" Mar 20 13:36:57 crc kubenswrapper[4755]: E0320 13:36:57.102856 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64\": container with ID starting with b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64 not found: ID does not exist" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.102889 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64"} err="failed to get container status \"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64\": rpc error: code = NotFound desc = could not find container \"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64\": container with ID starting with b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64 not found: ID does not exist" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.234884 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" 
path="/var/lib/kubelet/pods/ce4d5763-1786-4b87-8497-0c65da46f446/volumes" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.270425 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tflvc"] Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272355 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272395 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272431 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272452 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272481 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272500 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272520 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" containerName="oc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272545 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" containerName="oc" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272576 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272594 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272626 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272646 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272704 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272722 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272753 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272772 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272803 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272820 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272852 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" 
containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272871 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272893 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272912 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273238 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" containerName="oc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273265 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273285 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273300 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273312 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.274119 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.298261 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tflvc"] Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.359972 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-trusted-ca\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360088 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-tls\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-certificates\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360221 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360271 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-bound-sa-token\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360573 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k92xm\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-kube-api-access-k92xm\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.399738 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-certificates\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461768 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-bound-sa-token\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461794 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 
13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k92xm\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-kube-api-access-k92xm\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-trusted-ca\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-tls\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.462810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.463641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-certificates\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.464081 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-trusted-ca\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.469894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.470406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-tls\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.486138 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k92xm\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-kube-api-access-k92xm\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.493039 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-bound-sa-token\") pod \"image-registry-66df7c8f76-tflvc\" (UID: 
\"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.613862 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:12 crc kubenswrapper[4755]: I0320 13:37:12.098762 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tflvc"] Mar 20 13:37:12 crc kubenswrapper[4755]: I0320 13:37:12.145693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" event={"ID":"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5","Type":"ContainerStarted","Data":"5b1596ed1b6752a14b0f23edee121913b572c520f5dd7bb0b18894f636bfdfae"} Mar 20 13:37:13 crc kubenswrapper[4755]: I0320 13:37:13.154693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" event={"ID":"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5","Type":"ContainerStarted","Data":"d3e00949a794bbe84cf9b546ffe377923acff43ce4c3c50704838528d8d3e89a"} Mar 20 13:37:13 crc kubenswrapper[4755]: I0320 13:37:13.155237 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:13 crc kubenswrapper[4755]: I0320 13:37:13.194335 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" podStartSLOduration=2.194311542 podStartE2EDuration="2.194311542s" podCreationTimestamp="2026-03-20 13:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:13.188702768 +0000 UTC m=+412.786635357" watchObservedRunningTime="2026-03-20 13:37:13.194311542 +0000 UTC m=+412.792244071" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 
13:37:28.619897 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.620850 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-shzbw" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" containerID="cri-o://bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.642717 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.643043 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cgznb" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" containerID="cri-o://4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.655808 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.656166 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" containerID="cri-o://5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.684735 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.685055 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-929x7" 
podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" containerID="cri-o://2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.687446 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.687778 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" containerID="cri-o://ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.695683 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ngw4b"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.696404 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ngw4b"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.696481 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.744424 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.744480 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2bf\" (UniqueName: \"kubernetes.io/projected/6d1fc18c-b364-439b-926f-12fe310d0917-kube-api-access-9k2bf\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.744521 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.846166 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.846221 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9k2bf\" (UniqueName: \"kubernetes.io/projected/6d1fc18c-b364-439b-926f-12fe310d0917-kube-api-access-9k2bf\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.846269 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.849032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.857102 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.868019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2bf\" (UniqueName: \"kubernetes.io/projected/6d1fc18c-b364-439b-926f-12fe310d0917-kube-api-access-9k2bf\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.142855 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.148687 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.160801 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.165958 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.194335 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.218826 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250429 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"e8e34571-6648-4e5e-b3e9-05f87454e19a\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"e8e34571-6648-4e5e-b3e9-05f87454e19a\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250519 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250564 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"eca3198b-684d-4a52-b4aa-858ced996bae\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250593 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"2db67acd-25db-47a7-80ea-da4065a60e23\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250622 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"eca3198b-684d-4a52-b4aa-858ced996bae\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250642 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250699 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"887fa242-bd5e-40f5-8f6e-a81c6e976322\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250741 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"2db67acd-25db-47a7-80ea-da4065a60e23\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250774 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"e8e34571-6648-4e5e-b3e9-05f87454e19a\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250821 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"887fa242-bd5e-40f5-8f6e-a81c6e976322\" (UID: 
\"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"eca3198b-684d-4a52-b4aa-858ced996bae\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"2db67acd-25db-47a7-80ea-da4065a60e23\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250926 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250953 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"887fa242-bd5e-40f5-8f6e-a81c6e976322\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.259087 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb" (OuterVolumeSpecName: "kube-api-access-qsjvb") pod "2d2017d2-f4ee-4056-b350-cc313f3faeaf" (UID: "2d2017d2-f4ee-4056-b350-cc313f3faeaf"). InnerVolumeSpecName "kube-api-access-qsjvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.259563 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6" (OuterVolumeSpecName: "kube-api-access-m5hm6") pod "eca3198b-684d-4a52-b4aa-858ced996bae" (UID: "eca3198b-684d-4a52-b4aa-858ced996bae"). InnerVolumeSpecName "kube-api-access-m5hm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.260448 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "eca3198b-684d-4a52-b4aa-858ced996bae" (UID: "eca3198b-684d-4a52-b4aa-858ced996bae"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.260747 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities" (OuterVolumeSpecName: "utilities") pod "887fa242-bd5e-40f5-8f6e-a81c6e976322" (UID: "887fa242-bd5e-40f5-8f6e-a81c6e976322"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.264524 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities" (OuterVolumeSpecName: "utilities") pod "2db67acd-25db-47a7-80ea-da4065a60e23" (UID: "2db67acd-25db-47a7-80ea-da4065a60e23"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.265752 4755 generic.go:334] "Generic (PLEG): container finished" podID="eca3198b-684d-4a52-b4aa-858ced996bae" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.265893 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerDied","Data":"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.265962 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerDied","Data":"ffd17bcea5582e9144ff86b2de342c1b3c61951742cefde886baf98d6e66252d"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.266031 4755 scope.go:117] "RemoveContainer" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.266138 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.267050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities" (OuterVolumeSpecName: "utilities") pod "e8e34571-6648-4e5e-b3e9-05f87454e19a" (UID: "e8e34571-6648-4e5e-b3e9-05f87454e19a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.267196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities" (OuterVolumeSpecName: "utilities") pod "2d2017d2-f4ee-4056-b350-cc313f3faeaf" (UID: "2d2017d2-f4ee-4056-b350-cc313f3faeaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277272 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9" (OuterVolumeSpecName: "kube-api-access-w5qg9") pod "e8e34571-6648-4e5e-b3e9-05f87454e19a" (UID: "e8e34571-6648-4e5e-b3e9-05f87454e19a"). InnerVolumeSpecName "kube-api-access-w5qg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277441 4755 generic.go:334] "Generic (PLEG): container finished" podID="2db67acd-25db-47a7-80ea-da4065a60e23" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277498 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.283773 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"0ab76dafe853da1151a253ddbccefd2f71d9bf47c5abfc10da67278b7f81253e"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.280105 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "eca3198b-684d-4a52-b4aa-858ced996bae" (UID: "eca3198b-684d-4a52-b4aa-858ced996bae"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.280817 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5" (OuterVolumeSpecName: "kube-api-access-hchb5") pod "2db67acd-25db-47a7-80ea-da4065a60e23" (UID: "2db67acd-25db-47a7-80ea-da4065a60e23"). InnerVolumeSpecName "kube-api-access-hchb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277632 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285006 4755 generic.go:334] "Generic (PLEG): container finished" podID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285128 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285138 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285432 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc" (OuterVolumeSpecName: "kube-api-access-g2rrc") pod "887fa242-bd5e-40f5-8f6e-a81c6e976322" (UID: "887fa242-bd5e-40f5-8f6e-a81c6e976322"). InnerVolumeSpecName "kube-api-access-g2rrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.290915 4755 scope.go:117] "RemoveContainer" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.293171 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12\": container with ID starting with 5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12 not found: ID does not exist" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.293204 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12"} err="failed to get container status \"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12\": rpc error: code = NotFound desc = could not find container \"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12\": container with ID starting with 5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.293226 4755 scope.go:117] "RemoveContainer" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.294238 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d2017d2-f4ee-4056-b350-cc313f3faeaf" (UID: "2d2017d2-f4ee-4056-b350-cc313f3faeaf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296267 4755 generic.go:334] "Generic (PLEG): container finished" podID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296397 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296451 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"ea85ece18daec304b9cecefa9ca55b3c7ddbfc128e021ebc4bfd2b1a692b4346"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296522 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301251 4755 generic.go:334] "Generic (PLEG): container finished" podID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301299 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"28734b0d2914118b3d9d2819be5a8fd3a2768be1a04f071ed6cc45a5baf248f6"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301356 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.309985 4755 scope.go:117] "RemoveContainer" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.334501 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8e34571-6648-4e5e-b3e9-05f87454e19a" (UID: "e8e34571-6648-4e5e-b3e9-05f87454e19a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.338202 4755 scope.go:117] "RemoveContainer" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.343624 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db67acd-25db-47a7-80ea-da4065a60e23" (UID: "2db67acd-25db-47a7-80ea-da4065a60e23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352370 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352428 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352442 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352474 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352489 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5hm6\" (UniqueName: 
\"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352503 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352514 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352528 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352561 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352572 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352583 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352594 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352607 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352633 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.361339 4755 scope.go:117] "RemoveContainer" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.362481 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7\": container with ID starting with bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7 not found: ID does not exist" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.362546 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7"} err="failed to get container status \"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7\": rpc error: code = NotFound desc = could not find container \"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7\": container with ID starting with bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.362583 4755 scope.go:117] 
"RemoveContainer" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.363001 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12\": container with ID starting with 76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12 not found: ID does not exist" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363048 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12"} err="failed to get container status \"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12\": rpc error: code = NotFound desc = could not find container \"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12\": container with ID starting with 76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363082 4755 scope.go:117] "RemoveContainer" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.363387 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2\": container with ID starting with 6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2 not found: ID does not exist" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363499 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2"} err="failed to get container status \"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2\": rpc error: code = NotFound desc = could not find container \"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2\": container with ID starting with 6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363572 4755 scope.go:117] "RemoveContainer" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.400702 4755 scope.go:117] "RemoveContainer" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.410082 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ngw4b"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.417894 4755 scope.go:117] "RemoveContainer" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" Mar 20 13:37:29 crc kubenswrapper[4755]: W0320 13:37:29.421209 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1fc18c_b364_439b_926f_12fe310d0917.slice/crio-2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f WatchSource:0}: Error finding container 2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f: Status 404 returned error can't find the container with id 2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.439292 4755 scope.go:117] "RemoveContainer" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.439913 4755 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247\": container with ID starting with 2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247 not found: ID does not exist" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.439957 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247"} err="failed to get container status \"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247\": rpc error: code = NotFound desc = could not find container \"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247\": container with ID starting with 2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.439985 4755 scope.go:117] "RemoveContainer" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.440441 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de\": container with ID starting with 49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de not found: ID does not exist" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.440505 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de"} err="failed to get container status \"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de\": rpc error: code = NotFound desc = could not find 
container \"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de\": container with ID starting with 49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.440550 4755 scope.go:117] "RemoveContainer" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.441588 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857\": container with ID starting with e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857 not found: ID does not exist" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.441614 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857"} err="failed to get container status \"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857\": rpc error: code = NotFound desc = could not find container \"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857\": container with ID starting with e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.441630 4755 scope.go:117] "RemoveContainer" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.457369 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "887fa242-bd5e-40f5-8f6e-a81c6e976322" (UID: "887fa242-bd5e-40f5-8f6e-a81c6e976322"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.457788 4755 scope.go:117] "RemoveContainer" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.488531 4755 scope.go:117] "RemoveContainer" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.519871 4755 scope.go:117] "RemoveContainer" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.520262 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520\": container with ID starting with ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520 not found: ID does not exist" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520296 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520"} err="failed to get container status \"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520\": rpc error: code = NotFound desc = could not find container \"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520\": container with ID starting with ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520317 4755 scope.go:117] "RemoveContainer" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.520556 4755 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85\": container with ID starting with 1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85 not found: ID does not exist" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520588 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85"} err="failed to get container status \"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85\": rpc error: code = NotFound desc = could not find container \"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85\": container with ID starting with 1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520606 4755 scope.go:117] "RemoveContainer" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.520829 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad\": container with ID starting with 388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad not found: ID does not exist" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520849 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad"} err="failed to get container status \"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad\": rpc error: code = NotFound desc = could not find container 
\"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad\": container with ID starting with 388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520866 4755 scope.go:117] "RemoveContainer" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.539111 4755 scope.go:117] "RemoveContainer" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.554793 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.559280 4755 scope.go:117] "RemoveContainer" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.571945 4755 scope.go:117] "RemoveContainer" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.572234 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476\": container with ID starting with 4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476 not found: ID does not exist" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572279 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476"} err="failed to get container status \"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476\": rpc error: 
code = NotFound desc = could not find container \"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476\": container with ID starting with 4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572303 4755 scope.go:117] "RemoveContainer" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.572642 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee\": container with ID starting with a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee not found: ID does not exist" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572754 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee"} err="failed to get container status \"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee\": rpc error: code = NotFound desc = could not find container \"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee\": container with ID starting with a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572804 4755 scope.go:117] "RemoveContainer" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.573365 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703\": container with ID starting with 
eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703 not found: ID does not exist" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.573400 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703"} err="failed to get container status \"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703\": rpc error: code = NotFound desc = could not find container \"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703\": container with ID starting with eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.596778 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.600245 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.630006 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.633311 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.648224 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.668601 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.694037 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 
13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.702852 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.705668 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.709123 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.236889 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-srzwn"] Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237203 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237227 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237241 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237254 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237273 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237286 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237307 4755 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237320 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237340 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237354 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237369 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237383 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237399 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237410 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237433 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237446 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237481 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237493 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237508 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237521 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237540 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237553 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237567 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237580 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237598 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237609 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237781 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237800 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237816 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237836 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237863 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.239087 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.242478 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.254075 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srzwn"] Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.289034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq68d\" (UniqueName: \"kubernetes.io/projected/1107b669-3bdf-4189-a37a-b79ddb758fff-kube-api-access-lq68d\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.289091 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-catalog-content\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.289163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-utilities\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.312294 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" 
event={"ID":"6d1fc18c-b364-439b-926f-12fe310d0917","Type":"ContainerStarted","Data":"7f2ba372670391d5fcf019a9e918249106823165de1f1a45210f90b435f1c486"} Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.312342 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" event={"ID":"6d1fc18c-b364-439b-926f-12fe310d0917","Type":"ContainerStarted","Data":"2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f"} Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.312473 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.316672 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.332509 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" podStartSLOduration=2.332469686 podStartE2EDuration="2.332469686s" podCreationTimestamp="2026-03-20 13:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:30.33044893 +0000 UTC m=+429.928381459" watchObservedRunningTime="2026-03-20 13:37:30.332469686 +0000 UTC m=+429.930402225" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.390337 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-utilities\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.390454 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lq68d\" (UniqueName: \"kubernetes.io/projected/1107b669-3bdf-4189-a37a-b79ddb758fff-kube-api-access-lq68d\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.390500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-catalog-content\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.391610 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-catalog-content\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.392814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-utilities\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.426676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq68d\" (UniqueName: \"kubernetes.io/projected/1107b669-3bdf-4189-a37a-b79ddb758fff-kube-api-access-lq68d\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.555022 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.007062 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srzwn"] Mar 20 13:37:31 crc kubenswrapper[4755]: W0320 13:37:31.014688 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1107b669_3bdf_4189_a37a_b79ddb758fff.slice/crio-355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f WatchSource:0}: Error finding container 355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f: Status 404 returned error can't find the container with id 355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.246279 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" path="/var/lib/kubelet/pods/2d2017d2-f4ee-4056-b350-cc313f3faeaf/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.250481 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" path="/var/lib/kubelet/pods/2db67acd-25db-47a7-80ea-da4065a60e23/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.251380 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" path="/var/lib/kubelet/pods/887fa242-bd5e-40f5-8f6e-a81c6e976322/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.252232 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" path="/var/lib/kubelet/pods/e8e34571-6648-4e5e-b3e9-05f87454e19a/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.253089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" 
path="/var/lib/kubelet/pods/eca3198b-684d-4a52-b4aa-858ced996bae/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.253641 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c9s6q"] Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.256372 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9s6q"] Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.256541 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.264608 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.303733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-utilities\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.303802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7pl\" (UniqueName: \"kubernetes.io/projected/f483e049-5032-496f-8608-494e07922763-kube-api-access-fp7pl\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.303859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-catalog-content\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " 
pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.325271 4755 generic.go:334] "Generic (PLEG): container finished" podID="1107b669-3bdf-4189-a37a-b79ddb758fff" containerID="ce76a7d0c9414180f481727757d7b93f76b91fd1dd18b729c9b739307d7f3e2f" exitCode=0 Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.325377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerDied","Data":"ce76a7d0c9414180f481727757d7b93f76b91fd1dd18b729c9b739307d7f3e2f"} Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.325444 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerStarted","Data":"355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f"} Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.405932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-utilities\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.406003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7pl\" (UniqueName: \"kubernetes.io/projected/f483e049-5032-496f-8608-494e07922763-kube-api-access-fp7pl\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.406739 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-catalog-content\") pod 
\"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.406796 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-utilities\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.407034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-catalog-content\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.433296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7pl\" (UniqueName: \"kubernetes.io/projected/f483e049-5032-496f-8608-494e07922763-kube-api-access-fp7pl\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.620027 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.621207 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.711250 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.134813 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9s6q"] Mar 20 13:37:32 crc kubenswrapper[4755]: W0320 13:37:32.138199 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf483e049_5032_496f_8608_494e07922763.slice/crio-b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca WatchSource:0}: Error finding container b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca: Status 404 returned error can't find the container with id b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.333181 4755 generic.go:334] "Generic (PLEG): container finished" podID="f483e049-5032-496f-8608-494e07922763" containerID="4a6873996a090c1ab71ee644f4d8f0225205dd1c16487fa9851d034e5bc18c2a" exitCode=0 Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.333246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerDied","Data":"4a6873996a090c1ab71ee644f4d8f0225205dd1c16487fa9851d034e5bc18c2a"} Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.333909 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerStarted","Data":"b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca"} Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.639990 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-6g8x4"] Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.641637 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.647755 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.652930 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6g8x4"] Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.729442 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwwd\" (UniqueName: \"kubernetes.io/projected/504e1957-f41e-4927-927f-d5ac7e8eb625-kube-api-access-zdwwd\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.729504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-utilities\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.729525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-catalog-content\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zdwwd\" (UniqueName: \"kubernetes.io/projected/504e1957-f41e-4927-927f-d5ac7e8eb625-kube-api-access-zdwwd\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-catalog-content\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831486 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-utilities\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-catalog-content\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.832048 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-utilities\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.857071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwwd\" (UniqueName: 
\"kubernetes.io/projected/504e1957-f41e-4927-927f-d5ac7e8eb625-kube-api-access-zdwwd\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.980435 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.342579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerStarted","Data":"e8572116327aa3808e0daf8afd36ba21267844e96c9401b6c9a39cd1d7218694"} Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.345205 4755 generic.go:334] "Generic (PLEG): container finished" podID="1107b669-3bdf-4189-a37a-b79ddb758fff" containerID="3e7af82c28d3451b21b1a29643459a1913ef0639ea44724458fcd3ef408c61b7" exitCode=0 Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.345298 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerDied","Data":"3e7af82c28d3451b21b1a29643459a1913ef0639ea44724458fcd3ef408c61b7"} Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.481245 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6g8x4"] Mar 20 13:37:33 crc kubenswrapper[4755]: W0320 13:37:33.486624 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod504e1957_f41e_4927_927f_d5ac7e8eb625.slice/crio-49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78 WatchSource:0}: Error finding container 49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78: Status 404 returned error can't find the container with id 
49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78 Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.634224 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nql9k"] Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.636352 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.641102 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.646728 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nql9k"] Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.759718 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-utilities\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.759760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-catalog-content\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.759783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfkcc\" (UniqueName: \"kubernetes.io/projected/2b421640-e220-4567-8600-8e0ba78a981a-kube-api-access-qfkcc\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " 
pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861383 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-utilities\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861451 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-catalog-content\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861481 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfkcc\" (UniqueName: \"kubernetes.io/projected/2b421640-e220-4567-8600-8e0ba78a981a-kube-api-access-qfkcc\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861907 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-utilities\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.862032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-catalog-content\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " 
pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.888094 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfkcc\" (UniqueName: \"kubernetes.io/projected/2b421640-e220-4567-8600-8e0ba78a981a-kube-api-access-qfkcc\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.969031 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.206197 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nql9k"] Mar 20 13:37:34 crc kubenswrapper[4755]: W0320 13:37:34.212941 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b421640_e220_4567_8600_8e0ba78a981a.slice/crio-e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862 WatchSource:0}: Error finding container e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862: Status 404 returned error can't find the container with id e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.355182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerStarted","Data":"770d69d023566d2bc06547337ff64b3e1944607bd793540cdd3283051a76262e"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.356516 4755 generic.go:334] "Generic (PLEG): container finished" podID="2b421640-e220-4567-8600-8e0ba78a981a" containerID="6e7ad26e08ee1cb8fc5d82c6121820f29a80fe7890ca59176dcc322f331b168d" exitCode=0 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 
13:37:34.356578 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerDied","Data":"6e7ad26e08ee1cb8fc5d82c6121820f29a80fe7890ca59176dcc322f331b168d"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.356613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerStarted","Data":"e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.360387 4755 generic.go:334] "Generic (PLEG): container finished" podID="504e1957-f41e-4927-927f-d5ac7e8eb625" containerID="696a395a41fc5c2cad8e0190521f19c702e94a769590b9c0630323b307518eaf" exitCode=0 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.360734 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerDied","Data":"696a395a41fc5c2cad8e0190521f19c702e94a769590b9c0630323b307518eaf"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.360808 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerStarted","Data":"49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.364548 4755 generic.go:334] "Generic (PLEG): container finished" podID="f483e049-5032-496f-8608-494e07922763" containerID="e8572116327aa3808e0daf8afd36ba21267844e96c9401b6c9a39cd1d7218694" exitCode=0 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.364595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" 
event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerDied","Data":"e8572116327aa3808e0daf8afd36ba21267844e96c9401b6c9a39cd1d7218694"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.403035 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-srzwn" podStartSLOduration=1.85891135 podStartE2EDuration="4.403008743s" podCreationTimestamp="2026-03-20 13:37:30 +0000 UTC" firstStartedPulling="2026-03-20 13:37:31.32642831 +0000 UTC m=+430.924360849" lastFinishedPulling="2026-03-20 13:37:33.870525723 +0000 UTC m=+433.468458242" observedRunningTime="2026-03-20 13:37:34.379866215 +0000 UTC m=+433.977798744" watchObservedRunningTime="2026-03-20 13:37:34.403008743 +0000 UTC m=+434.000941272" Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.371982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerStarted","Data":"6aae85209875e48f8512f2703a3762a52951a87f9743bd080810cdcb9a97dd2b"} Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.374957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerStarted","Data":"3dc34b30bf3966ed77988b050ebc2a79979f98ab189b065770efcef17170409f"} Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.377204 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerStarted","Data":"628783d49078f423d100ac8dd4fa416c59eff99f8730012f920ca8a3ea473db0"} Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.416864 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c9s6q" podStartSLOduration=1.933811737 podStartE2EDuration="4.416837836s" 
podCreationTimestamp="2026-03-20 13:37:31 +0000 UTC" firstStartedPulling="2026-03-20 13:37:32.335284206 +0000 UTC m=+431.933216735" lastFinishedPulling="2026-03-20 13:37:34.818310305 +0000 UTC m=+434.416242834" observedRunningTime="2026-03-20 13:37:35.41266815 +0000 UTC m=+435.010600699" watchObservedRunningTime="2026-03-20 13:37:35.416837836 +0000 UTC m=+435.014770365" Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.390205 4755 generic.go:334] "Generic (PLEG): container finished" podID="2b421640-e220-4567-8600-8e0ba78a981a" containerID="6aae85209875e48f8512f2703a3762a52951a87f9743bd080810cdcb9a97dd2b" exitCode=0 Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.390289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerDied","Data":"6aae85209875e48f8512f2703a3762a52951a87f9743bd080810cdcb9a97dd2b"} Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.393447 4755 generic.go:334] "Generic (PLEG): container finished" podID="504e1957-f41e-4927-927f-d5ac7e8eb625" containerID="628783d49078f423d100ac8dd4fa416c59eff99f8730012f920ca8a3ea473db0" exitCode=0 Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.395573 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerDied","Data":"628783d49078f423d100ac8dd4fa416c59eff99f8730012f920ca8a3ea473db0"} Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.405971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerStarted","Data":"341d50118145d4c32134de2378d95edc5af47e2018c48c3096921a2849e7a30e"} Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.408768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerStarted","Data":"eec474a6b04a7c91815a071ff884e25ff0c39cc60f17c3469adceb3d7ee6d1f7"} Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.425166 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nql9k" podStartSLOduration=1.913511437 podStartE2EDuration="4.425146896s" podCreationTimestamp="2026-03-20 13:37:33 +0000 UTC" firstStartedPulling="2026-03-20 13:37:34.357767316 +0000 UTC m=+433.955699835" lastFinishedPulling="2026-03-20 13:37:36.869402745 +0000 UTC m=+436.467335294" observedRunningTime="2026-03-20 13:37:37.423859271 +0000 UTC m=+437.021791810" watchObservedRunningTime="2026-03-20 13:37:37.425146896 +0000 UTC m=+437.023079425" Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.451713 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6g8x4" podStartSLOduration=2.97094628 podStartE2EDuration="5.451688537s" podCreationTimestamp="2026-03-20 13:37:32 +0000 UTC" firstStartedPulling="2026-03-20 13:37:34.362083436 +0000 UTC m=+433.960015965" lastFinishedPulling="2026-03-20 13:37:36.842825683 +0000 UTC m=+436.440758222" observedRunningTime="2026-03-20 13:37:37.447168842 +0000 UTC m=+437.045101401" watchObservedRunningTime="2026-03-20 13:37:37.451688537 +0000 UTC m=+437.049621066" Mar 20 13:37:40 crc kubenswrapper[4755]: I0320 13:37:40.556091 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:40 crc kubenswrapper[4755]: I0320 13:37:40.556199 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:40 crc kubenswrapper[4755]: I0320 13:37:40.601997 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:41 crc kubenswrapper[4755]: I0320 13:37:41.518645 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:41 crc kubenswrapper[4755]: I0320 13:37:41.621987 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:41 crc kubenswrapper[4755]: I0320 13:37:41.622055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:42 crc kubenswrapper[4755]: I0320 13:37:42.677821 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c9s6q" podUID="f483e049-5032-496f-8608-494e07922763" containerName="registry-server" probeResult="failure" output=< Mar 20 13:37:42 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:37:42 crc kubenswrapper[4755]: > Mar 20 13:37:42 crc kubenswrapper[4755]: I0320 13:37:42.981516 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:42 crc kubenswrapper[4755]: I0320 13:37:42.981592 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:43 crc kubenswrapper[4755]: I0320 13:37:43.052934 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:43 crc kubenswrapper[4755]: I0320 13:37:43.504838 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:43 crc kubenswrapper[4755]: I0320 13:37:43.970210 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:43 crc 
kubenswrapper[4755]: I0320 13:37:43.970309 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:44 crc kubenswrapper[4755]: I0320 13:37:44.025100 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:44 crc kubenswrapper[4755]: I0320 13:37:44.532634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:51 crc kubenswrapper[4755]: I0320 13:37:51.677393 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:51 crc kubenswrapper[4755]: I0320 13:37:51.737355 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:56 crc kubenswrapper[4755]: I0320 13:37:56.752679 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" containerID="cri-o://13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" gracePeriod=30 Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.178247 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314457 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314573 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314636 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314904 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314933 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314958 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314996 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.315048 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.315647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.316918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.325922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm" (OuterVolumeSpecName: "kube-api-access-gpvlm") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "kube-api-access-gpvlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.326749 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.328842 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.329022 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.334764 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.336907 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416669 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416708 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416718 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416735 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416746 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416754 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416762 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552207 4755 generic.go:334] "Generic (PLEG): container finished" podID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" exitCode=0 Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerDied","Data":"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3"} Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552309 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552338 4755 scope.go:117] "RemoveContainer" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerDied","Data":"500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6"} Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.570586 4755 scope.go:117] "RemoveContainer" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" Mar 20 13:37:57 crc kubenswrapper[4755]: E0320 13:37:57.571043 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3\": container with ID starting with 13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3 not found: ID does not exist" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.571071 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3"} err="failed to get container status \"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3\": rpc error: code = NotFound desc = could not find container \"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3\": container with ID starting with 13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3 not found: ID does not exist" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.587775 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.592145 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:37:59 crc kubenswrapper[4755]: I0320 13:37:59.231627 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" path="/var/lib/kubelet/pods/408c6869-42d8-4cbc-a261-57fb45f0d666/volumes" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.133576 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:38:00 crc kubenswrapper[4755]: E0320 13:38:00.133848 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.133861 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.133958 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.134330 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.136283 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.136561 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.137315 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.149418 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.258207 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.258784 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"auto-csr-approver-29566898-pbw9z\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.259066 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.259546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.268771 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.361239 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"auto-csr-approver-29566898-pbw9z\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.383374 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"auto-csr-approver-29566898-pbw9z\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.463676 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.526974 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.946718 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:38:00 crc kubenswrapper[4755]: W0320 13:38:00.953849 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb576c19_7f49_40ac_987b_5eefb5db31ce.slice/crio-0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25 WatchSource:0}: Error finding container 0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25: Status 404 returned error can't find the container with id 0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25 Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.378404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.378581 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.385818 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.386422 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.526185 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.526209 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.603466 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0a971fb54dae1dc6dd5a7cb86fdb33719d285d5997bdc789a750b5489dac589f"} Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.606197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" event={"ID":"cb576c19-7f49-40ac-987b-5eefb5db31ce","Type":"ContainerStarted","Data":"0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25"} Mar 20 13:38:01 crc kubenswrapper[4755]: W0320 13:38:01.893986 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad WatchSource:0}: Error finding container ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad: Status 404 returned error can't find the container with id ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad Mar 20 13:38:02 crc kubenswrapper[4755]: W0320 13:38:02.088248 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980 WatchSource:0}: Error finding container 2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980: Status 404 returned error can't find the container with id 2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980 Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.613873 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"14c03d5bb6f71fb61256b58cc54b877069e5a78886068524dc350fc5cfb18820"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.615399 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fddd2f491ddff3a4b7735de2e6eac05470ce4139d55203831b9c334a4a28de32"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.615431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.616144 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.617003 4755 generic.go:334] "Generic (PLEG): container finished" podID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerID="0475ddf40f1f946ce60b2db15f62182a6200adfb2c95b60479c44432cfa187cc" exitCode=0 Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.617129 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" event={"ID":"cb576c19-7f49-40ac-987b-5eefb5db31ce","Type":"ContainerDied","Data":"0475ddf40f1f946ce60b2db15f62182a6200adfb2c95b60479c44432cfa187cc"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.618761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"aab4e784fd739fffb08708c0e260c074b42ae552f5d0712d1fd356ef51556faf"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.618793 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad"} Mar 20 13:38:03 crc kubenswrapper[4755]: I0320 13:38:03.880762 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.013205 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"cb576c19-7f49-40ac-987b-5eefb5db31ce\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.019070 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf" (OuterVolumeSpecName: "kube-api-access-nrnnf") pod "cb576c19-7f49-40ac-987b-5eefb5db31ce" (UID: "cb576c19-7f49-40ac-987b-5eefb5db31ce"). InnerVolumeSpecName "kube-api-access-nrnnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.113999 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.640922 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" event={"ID":"cb576c19-7f49-40ac-987b-5eefb5db31ce","Type":"ContainerDied","Data":"0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25"} Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.640966 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.641351 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.949797 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.953868 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:38:05 crc kubenswrapper[4755]: I0320 13:38:05.234832 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" path="/var/lib/kubelet/pods/28deea0d-d80e-422b-a0c2-40670570aa68/volumes" Mar 20 13:38:06 crc kubenswrapper[4755]: I0320 13:38:06.751445 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 13:38:06 crc kubenswrapper[4755]: I0320 13:38:06.751519 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:36 crc kubenswrapper[4755]: I0320 13:38:36.751971 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:38:36 crc kubenswrapper[4755]: I0320 13:38:36.753802 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:41 crc kubenswrapper[4755]: I0320 13:38:41.533188 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.756061 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.757068 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.757143 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.758017 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.758120 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab" gracePeriod=600 Mar 20 13:39:06 crc kubenswrapper[4755]: E0320 13:39:06.800486 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb406f6_1a26_4eea_84ac_e55f5232900c.slice/crio-d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099142 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab" exitCode=0 Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab"} Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099758 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08"} Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099802 4755 scope.go:117] "RemoveContainer" containerID="bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.155019 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:40:00 crc kubenswrapper[4755]: E0320 13:40:00.156731 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerName="oc" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.156766 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerName="oc" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.156980 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerName="oc" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.158264 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.160931 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.161161 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.162014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.162143 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.337321 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"auto-csr-approver-29566900-nvf9d\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.439236 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"auto-csr-approver-29566900-nvf9d\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.472976 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"auto-csr-approver-29566900-nvf9d\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " 
pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.499308 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.773970 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.785639 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:40:01 crc kubenswrapper[4755]: I0320 13:40:01.525297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" event={"ID":"6af9d427-765f-4d25-9603-e0b39103e2cc","Type":"ContainerStarted","Data":"14e4905417f84cf04080b971ec51719d500b6a85c41b7b0c397dc676f500f783"} Mar 20 13:40:02 crc kubenswrapper[4755]: I0320 13:40:02.537728 4755 generic.go:334] "Generic (PLEG): container finished" podID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerID="5e2db37c71b317712977cebcaa50946c5ceb89b2e6b5818b0d77ab95b610e0b6" exitCode=0 Mar 20 13:40:02 crc kubenswrapper[4755]: I0320 13:40:02.537884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" event={"ID":"6af9d427-765f-4d25-9603-e0b39103e2cc","Type":"ContainerDied","Data":"5e2db37c71b317712977cebcaa50946c5ceb89b2e6b5818b0d77ab95b610e0b6"} Mar 20 13:40:03 crc kubenswrapper[4755]: I0320 13:40:03.878899 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:03.997542 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"6af9d427-765f-4d25-9603-e0b39103e2cc\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.008689 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc" (OuterVolumeSpecName: "kube-api-access-jsvfc") pod "6af9d427-765f-4d25-9603-e0b39103e2cc" (UID: "6af9d427-765f-4d25-9603-e0b39103e2cc"). InnerVolumeSpecName "kube-api-access-jsvfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.099694 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.559780 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" event={"ID":"6af9d427-765f-4d25-9603-e0b39103e2cc","Type":"ContainerDied","Data":"14e4905417f84cf04080b971ec51719d500b6a85c41b7b0c397dc676f500f783"} Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.559823 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e4905417f84cf04080b971ec51719d500b6a85c41b7b0c397dc676f500f783" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.559895 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.955080 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.959118 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:40:05 crc kubenswrapper[4755]: I0320 13:40:05.241968 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" path="/var/lib/kubelet/pods/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34/volumes" Mar 20 13:41:22 crc kubenswrapper[4755]: I0320 13:41:22.088835 4755 scope.go:117] "RemoveContainer" containerID="5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a" Mar 20 13:41:22 crc kubenswrapper[4755]: I0320 13:41:22.137185 4755 scope.go:117] "RemoveContainer" containerID="6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9" Mar 20 13:41:36 crc kubenswrapper[4755]: I0320 13:41:36.751561 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:36 crc kubenswrapper[4755]: I0320 13:41:36.753813 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.153821 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:42:00 crc kubenswrapper[4755]: 
E0320 13:42:00.154975 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerName="oc" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.154994 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerName="oc" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.155123 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerName="oc" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.155645 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.159868 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.161628 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.167432 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.199514 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.251166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"auto-csr-approver-29566902-zh8v6\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") " pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.352856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"auto-csr-approver-29566902-zh8v6\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") " pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.383807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"auto-csr-approver-29566902-zh8v6\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") " pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.518524 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.795844 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:42:01 crc kubenswrapper[4755]: I0320 13:42:01.492450 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerStarted","Data":"2bbfbe341e14e25d280dea19851552932ed939e648b79b02aa8a08cab8af5eb9"} Mar 20 13:42:02 crc kubenswrapper[4755]: I0320 13:42:02.501010 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerStarted","Data":"dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac"} Mar 20 13:42:02 crc kubenswrapper[4755]: I0320 13:42:02.523407 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" podStartSLOduration=1.191208469 podStartE2EDuration="2.523380372s" podCreationTimestamp="2026-03-20 13:42:00 
+0000 UTC" firstStartedPulling="2026-03-20 13:42:00.811746822 +0000 UTC m=+700.409679351" lastFinishedPulling="2026-03-20 13:42:02.143918685 +0000 UTC m=+701.741851254" observedRunningTime="2026-03-20 13:42:02.51817238 +0000 UTC m=+702.116104919" watchObservedRunningTime="2026-03-20 13:42:02.523380372 +0000 UTC m=+702.121312911" Mar 20 13:42:03 crc kubenswrapper[4755]: I0320 13:42:03.509574 4755 generic.go:334] "Generic (PLEG): container finished" podID="320783b7-7554-4157-b6cd-143d787dc30b" containerID="dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac" exitCode=0 Mar 20 13:42:03 crc kubenswrapper[4755]: I0320 13:42:03.509708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerDied","Data":"dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac"} Mar 20 13:42:04 crc kubenswrapper[4755]: I0320 13:42:04.854396 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.025304 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"320783b7-7554-4157-b6cd-143d787dc30b\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") " Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.034818 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj" (OuterVolumeSpecName: "kube-api-access-ss9kj") pod "320783b7-7554-4157-b6cd-143d787dc30b" (UID: "320783b7-7554-4157-b6cd-143d787dc30b"). InnerVolumeSpecName "kube-api-access-ss9kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.127428 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.527531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerDied","Data":"2bbfbe341e14e25d280dea19851552932ed939e648b79b02aa8a08cab8af5eb9"} Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.528140 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bbfbe341e14e25d280dea19851552932ed939e648b79b02aa8a08cab8af5eb9" Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.527645 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.605582 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.614022 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:42:06 crc kubenswrapper[4755]: I0320 13:42:06.751288 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:42:06 crc kubenswrapper[4755]: I0320 13:42:06.751395 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" 
podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:42:07 crc kubenswrapper[4755]: I0320 13:42:07.235749 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" path="/var/lib/kubelet/pods/8532b92f-bed9-41b0-bf0d-99afa5703048/volumes" Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.751484 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.752573 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.752714 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.753726 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.753846 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08" gracePeriod=600 Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.770606 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08" exitCode=0 Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.770692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08"} Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.771208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38"} Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.771234 4755 scope.go:117] "RemoveContainer" containerID="d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.285435 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-l955j"] Mar 20 13:42:51 crc kubenswrapper[4755]: E0320 13:42:51.286176 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320783b7-7554-4157-b6cd-143d787dc30b" containerName="oc" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.286188 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="320783b7-7554-4157-b6cd-143d787dc30b" containerName="oc" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.286300 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="320783b7-7554-4157-b6cd-143d787dc30b" containerName="oc" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.286713 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.289023 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.289717 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-pc8h7" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.289956 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.304820 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-l955j"] Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.310710 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7gpgn"] Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.311420 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7gpgn" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.313847 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-t6rjl" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.320386 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb4pk\" (UniqueName: \"kubernetes.io/projected/cdf5c938-39f0-46a4-bce6-1a0cf67624ab-kube-api-access-nb4pk\") pod \"cert-manager-858654f9db-7gpgn\" (UID: \"cdf5c938-39f0-46a4-bce6-1a0cf67624ab\") " pod="cert-manager/cert-manager-858654f9db-7gpgn" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.320436 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzvj\" (UniqueName: \"kubernetes.io/projected/a3125fba-bed9-40d3-b53d-f976488e12d2-kube-api-access-jwzvj\") pod \"cert-manager-cainjector-cf98fcc89-l955j\" (UID: \"a3125fba-bed9-40d3-b53d-f976488e12d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.332034 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7gpgn"] Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.337348 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hbz2p"] Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.338321 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.342506 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pwksc" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.349137 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hbz2p"] Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.422018 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb4pk\" (UniqueName: \"kubernetes.io/projected/cdf5c938-39f0-46a4-bce6-1a0cf67624ab-kube-api-access-nb4pk\") pod \"cert-manager-858654f9db-7gpgn\" (UID: \"cdf5c938-39f0-46a4-bce6-1a0cf67624ab\") " pod="cert-manager/cert-manager-858654f9db-7gpgn" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.422061 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzvj\" (UniqueName: \"kubernetes.io/projected/a3125fba-bed9-40d3-b53d-f976488e12d2-kube-api-access-jwzvj\") pod \"cert-manager-cainjector-cf98fcc89-l955j\" (UID: \"a3125fba-bed9-40d3-b53d-f976488e12d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.422123 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbkn5\" (UniqueName: \"kubernetes.io/projected/f3b802e1-c690-4817-91cf-d721cbfae51c-kube-api-access-wbkn5\") pod \"cert-manager-webhook-687f57d79b-hbz2p\" (UID: \"f3b802e1-c690-4817-91cf-d721cbfae51c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.444170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb4pk\" (UniqueName: \"kubernetes.io/projected/cdf5c938-39f0-46a4-bce6-1a0cf67624ab-kube-api-access-nb4pk\") pod 
\"cert-manager-858654f9db-7gpgn\" (UID: \"cdf5c938-39f0-46a4-bce6-1a0cf67624ab\") " pod="cert-manager/cert-manager-858654f9db-7gpgn" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.451289 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzvj\" (UniqueName: \"kubernetes.io/projected/a3125fba-bed9-40d3-b53d-f976488e12d2-kube-api-access-jwzvj\") pod \"cert-manager-cainjector-cf98fcc89-l955j\" (UID: \"a3125fba-bed9-40d3-b53d-f976488e12d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.523850 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbkn5\" (UniqueName: \"kubernetes.io/projected/f3b802e1-c690-4817-91cf-d721cbfae51c-kube-api-access-wbkn5\") pod \"cert-manager-webhook-687f57d79b-hbz2p\" (UID: \"f3b802e1-c690-4817-91cf-d721cbfae51c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.543513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbkn5\" (UniqueName: \"kubernetes.io/projected/f3b802e1-c690-4817-91cf-d721cbfae51c-kube-api-access-wbkn5\") pod \"cert-manager-webhook-687f57d79b-hbz2p\" (UID: \"f3b802e1-c690-4817-91cf-d721cbfae51c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.610610 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.632896 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7gpgn" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.660043 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.894213 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-l955j"] Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.931277 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7gpgn"] Mar 20 13:42:51 crc kubenswrapper[4755]: W0320 13:42:51.932737 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdf5c938_39f0_46a4_bce6_1a0cf67624ab.slice/crio-86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea WatchSource:0}: Error finding container 86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea: Status 404 returned error can't find the container with id 86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.140730 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hbz2p"] Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.898468 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7gpgn" event={"ID":"cdf5c938-39f0-46a4-bce6-1a0cf67624ab","Type":"ContainerStarted","Data":"86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea"} Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.900462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" event={"ID":"a3125fba-bed9-40d3-b53d-f976488e12d2","Type":"ContainerStarted","Data":"81ac5a23d30dd2baf4a9ed224ad9946bb30b65b3721cfd349352adbdc4615c64"} Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.902035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" 
event={"ID":"f3b802e1-c690-4817-91cf-d721cbfae51c","Type":"ContainerStarted","Data":"28520456b43b76327fa3d35665c454163acb8ca94150f2a252c25d13e93b0e8b"} Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.923574 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7gpgn" event={"ID":"cdf5c938-39f0-46a4-bce6-1a0cf67624ab","Type":"ContainerStarted","Data":"204e46f0d6136ec475dd6ae41242dbe77978089d71f7a352dbbd7f27b9df0ef3"} Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.925716 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" event={"ID":"a3125fba-bed9-40d3-b53d-f976488e12d2","Type":"ContainerStarted","Data":"1c07257661096886d39e7b410e72c58fd60c4e93df1eab568b720e3c54f848cb"} Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.927508 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" event={"ID":"f3b802e1-c690-4817-91cf-d721cbfae51c","Type":"ContainerStarted","Data":"e238cdd2c82235f75eb7c47597f59cc4ca556ee48295f424497a94ef17e9b326"} Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.927736 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.969562 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7gpgn" podStartSLOduration=1.39869907 podStartE2EDuration="4.969519952s" podCreationTimestamp="2026-03-20 13:42:51 +0000 UTC" firstStartedPulling="2026-03-20 13:42:51.934255129 +0000 UTC m=+751.532187658" lastFinishedPulling="2026-03-20 13:42:55.505076011 +0000 UTC m=+755.103008540" observedRunningTime="2026-03-20 13:42:55.944967822 +0000 UTC m=+755.542900381" watchObservedRunningTime="2026-03-20 13:42:55.969519952 +0000 UTC m=+755.567452521" Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 
13:42:55.972796 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" podStartSLOduration=1.375408532 podStartE2EDuration="4.972773421s" podCreationTimestamp="2026-03-20 13:42:51 +0000 UTC" firstStartedPulling="2026-03-20 13:42:51.907912559 +0000 UTC m=+751.505845088" lastFinishedPulling="2026-03-20 13:42:55.505277408 +0000 UTC m=+755.103209977" observedRunningTime="2026-03-20 13:42:55.966820218 +0000 UTC m=+755.564752757" watchObservedRunningTime="2026-03-20 13:42:55.972773421 +0000 UTC m=+755.570706000" Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.983493 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" podStartSLOduration=1.563895821 podStartE2EDuration="4.983463123s" podCreationTimestamp="2026-03-20 13:42:51 +0000 UTC" firstStartedPulling="2026-03-20 13:42:52.149888277 +0000 UTC m=+751.747820806" lastFinishedPulling="2026-03-20 13:42:55.569455539 +0000 UTC m=+755.167388108" observedRunningTime="2026-03-20 13:42:55.981098358 +0000 UTC m=+755.579030897" watchObservedRunningTime="2026-03-20 13:42:55.983463123 +0000 UTC m=+755.581395652" Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.984254 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"] Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985481 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller" containerID="cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" gracePeriod=30 Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985531 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" 
containerName="nbdb" containerID="cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" gracePeriod=30 Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985601 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" gracePeriod=30 Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985809 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd" containerID="cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" gracePeriod=30 Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985887 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node" containerID="cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" gracePeriod=30 Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985979 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging" containerID="cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" gracePeriod=30 Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.986061 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb" containerID="cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" gracePeriod=30 Mar 20 13:43:01 crc 
kubenswrapper[4755]: I0320 13:43:01.053175 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" containerID="cri-o://e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" gracePeriod=30 Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.347117 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.349870 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-acl-logging/0.log" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.350463 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-controller/0.log" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.351064 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.433071 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dgqtf"] Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.434061 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.434269 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.434430 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.434589 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.435029 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.435227 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.435400 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.435598 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.435818 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.436176 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.436348 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.436887 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.436976 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437056 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437156 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437227 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437295 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kubecfg-setup" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437357 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kubecfg-setup" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437424 4755 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437495 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437618 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437713 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437793 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="nbdb" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437888 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="nbdb" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438140 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438229 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438304 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438374 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438447 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438514 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438577 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438642 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438748 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="nbdb" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438822 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.439117 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.439207 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.439408 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.439796 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.441960 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473804 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473869 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473968 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474114 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474191 4755 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474520 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474550 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474515 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474574 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474598 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474618 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474628 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474646 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474687 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash" (OuterVolumeSpecName: "host-slash") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474702 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474755 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474774 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474809 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474829 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474844 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474908 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474932 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsr4k\" (UniqueName: \"kubernetes.io/projected/9e7d5628-1936-4039-86ee-97de2cf80ad6-kube-api-access-tsr4k\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474952 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-node-log\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474972 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-systemd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-config\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-log-socket\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475026 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-env-overrides\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475026 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475077 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-netns\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475104 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475107 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475126 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475146 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-bin\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475164 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-slash\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-kubelet\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475201 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-systemd-units\") pod \"ovnkube-node-dgqtf\" (UID: 
\"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475222 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-var-lib-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-netd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475257 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-script-lib\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475281 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-etc-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475295 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovn-node-metrics-cert\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-ovn\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475356 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475367 4755 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475375 4755 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475384 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475393 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475402 4755 
reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475410 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475165 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475188 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475211 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475439 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475564 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475585 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log" (OuterVolumeSpecName: "node-log") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475610 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket" (OuterVolumeSpecName: "log-socket") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.477383 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.479598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.486519 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.487618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b" (OuterVolumeSpecName: "kube-api-access-jlq8b") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "kube-api-access-jlq8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.498168 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsr4k\" (UniqueName: \"kubernetes.io/projected/9e7d5628-1936-4039-86ee-97de2cf80ad6-kube-api-access-tsr4k\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576111 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-node-log\") pod 
\"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-systemd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576159 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-config\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-systemd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-node-log\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" 
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576473 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-log-socket\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-env-overrides\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-netns\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576685 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc 
kubenswrapper[4755]: I0320 13:43:01.576803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-bin\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-slash\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576884 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-kubelet\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-systemd-units\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577000 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-var-lib-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577037 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-netd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-script-lib\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-config\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovn-node-metrics-cert\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-etc-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577189 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-log-socket\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-ovn\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577352 4755 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577374 4755 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577391 4755 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577409 4755 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577428 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") on node \"crc\" DevicePath \"\"" Mar 20 
13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577444 4755 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577462 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577479 4755 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577497 4755 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577513 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577532 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577551 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577567 4755 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-ovn\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577620 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-env-overrides\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577682 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-systemd-units\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577723 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-var-lib-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-netd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 
20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577763 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577806 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-etc-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577850 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-kubelet\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577853 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-slash\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.578024 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-bin\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.578061 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-netns\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.578452 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-script-lib\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.584716 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovn-node-metrics-cert\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.601633 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsr4k\" (UniqueName: \"kubernetes.io/projected/9e7d5628-1936-4039-86ee-97de2cf80ad6-kube-api-access-tsr4k\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.665998 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.760464 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.990463 4755 generic.go:334] "Generic (PLEG): container finished" podID="9e7d5628-1936-4039-86ee-97de2cf80ad6" containerID="8093898e5298d04a4cfbe84857e7a5d3b869d75ee6e76431d6f3187ef1e83f01" exitCode=0 Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.990555 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerDied","Data":"8093898e5298d04a4cfbe84857e7a5d3b869d75ee6e76431d6f3187ef1e83f01"} Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.990619 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"5d41d05dc730594ae7f0ab6ae8dc2d4fe89e05bfb580d459973d1cb0d08179c7"} Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.993950 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/2.log" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996642 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996743 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5ba4f17-8c41-4124-b563-01d5f1751139" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" exitCode=2 Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996856 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" 
event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerDied","Data":"1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552"} Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996960 4755 scope.go:117] "RemoveContainer" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.998318 4755 scope.go:117] "RemoveContainer" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.998766 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139)\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.002832 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.010874 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-acl-logging/0.log" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.011595 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-controller/0.log" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013578 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013616 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013625 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013636 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013631 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013648 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013710 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013724 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" exitCode=143 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013735 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" exitCode=143 Mar 20 
13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013754 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013772 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013811 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013824 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013838 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013844 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013854 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013860 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013865 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013871 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013876 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013881 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013886 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013893 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013901 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013910 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013915 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013920 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013926 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013931 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013831 4755 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013937 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014064 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014081 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014089 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014144 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014153 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014161 4755 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014168 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014176 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014184 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014191 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014199 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014206 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014215 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014225 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"c5dfe5e0ba9e4e073084c039346a869cdace2560fac63c02b23de7cad0ed5e4a"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014237 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014247 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014254 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014264 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014271 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014278 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014286 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 
13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014292 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014299 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014306 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.050582 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.068874 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"] Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.075270 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"] Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.079195 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.105626 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.123315 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.180309 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" Mar 20 13:43:02 crc 
kubenswrapper[4755]: I0320 13:43:02.209206 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.251073 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.272905 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.290348 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.304978 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.324198 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.325083 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.325232 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with 
ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.325318 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.326185 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.326262 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.326334 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.326918 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" Mar 20 
13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.326992 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.327042 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.327780 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.327829 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.327858 4755 scope.go:117] "RemoveContainer" 
containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.328271 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.328353 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.328496 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.328941 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.328994 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.329028 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.329503 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.329575 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.329600 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.330311 4755 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.330349 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.330374 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.331296 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331336 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could not find container 
\"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331364 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.331732 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331810 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331915 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.332729 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find 
container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.332769 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333288 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333344 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333771 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333807 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334255 4755 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334285 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334913 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334938 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.335788 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 
02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.335825 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336338 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336366 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336810 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336840 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.337400 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.337423 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338346 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338388 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338883 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not 
exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338911 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.339225 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.339268 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340070 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340096 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340511 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status 
\"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340623 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341277 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341319 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341770 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341801 4755 scope.go:117] "RemoveContainer" 
containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.342278 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.342322 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.342935 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.343008 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344116 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could 
not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344148 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344490 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344519 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.345466 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.345505 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 
13:43:02.346025 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.346053 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.346708 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.346738 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347178 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 
985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347231 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347501 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347526 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347802 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347827 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348069 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348092 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348348 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348375 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348848 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not 
exist" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348872 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.349222 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist" Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"eaa68f4520836a4aa1778f4602112733625a44814fbca559bce5631d00337cf0"} Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021570 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"4d265715dc08d2eed66fade7a098da657c8066df17249cd01550a7f350e1bcde"} Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021590 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"b35b9e9adbb3c6a19448f21171b09caca902d53d0ec07f326d04aa50976acf43"} Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021604 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" 
event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"ff39e035be42fdc2b822e029f26c27a557fa07db1d50f323d4994c26e4437030"} Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"dcafc946601b25fdb92ca9369d3269fbfca344a83bbaac7be5d0e25dd62d1cf0"} Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021638 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"13304c8e375660819ed03ac2a12bd87812f675185fb96acf6676a6efc651baa0"} Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.022735 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/2.log" Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.239220 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" path="/var/lib/kubelet/pods/e0de398a-6f32-4b1c-a840-10ff45da7251/volumes" Mar 20 13:43:06 crc kubenswrapper[4755]: I0320 13:43:06.056190 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"b7bac5a98f64044d181da2907d2af7f15127a4bd13e1dec50169e6fa1db2f2fd"} Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.091231 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"a1627074f67d849a42402a3cc17a04efa9127c57322c6c903d4f2e418496bdfb"} Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.092020 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.092035 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.092044 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.123581 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" podStartSLOduration=7.123559901 podStartE2EDuration="7.123559901s" podCreationTimestamp="2026-03-20 13:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:08.11804735 +0000 UTC m=+767.715979899" watchObservedRunningTime="2026-03-20 13:43:08.123559901 +0000 UTC m=+767.721492430" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.126277 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.127547 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:15 crc kubenswrapper[4755]: I0320 13:43:15.226062 4755 scope.go:117] "RemoveContainer" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" Mar 20 13:43:15 crc kubenswrapper[4755]: E0320 13:43:15.227077 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139)\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:43:22 crc 
kubenswrapper[4755]: I0320 13:43:22.238697 4755 scope.go:117] "RemoveContainer" containerID="48fedef7d2253c830a250936f751690b6a7ff3c3f6839674f960627f11642a63" Mar 20 13:43:26 crc kubenswrapper[4755]: I0320 13:43:26.225470 4755 scope.go:117] "RemoveContainer" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" Mar 20 13:43:27 crc kubenswrapper[4755]: I0320 13:43:27.237049 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/2.log" Mar 20 13:43:27 crc kubenswrapper[4755]: I0320 13:43:27.237737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"f2b63f719c6f77c2b644c8e74f4be0a1dd2a972d78a6d1db6619be3ae9203011"} Mar 20 13:43:31 crc kubenswrapper[4755]: I0320 13:43:31.790699 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.568600 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv"] Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.571125 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.575007 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.585621 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv"] Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.715119 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.715254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.715309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: 
I0320 13:43:41.817192 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.817298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.817342 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.818029 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.818122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.845511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.955096 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:42 crc kubenswrapper[4755]: I0320 13:43:42.212972 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv"] Mar 20 13:43:42 crc kubenswrapper[4755]: I0320 13:43:42.351428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerStarted","Data":"9ee672984c4ceb2bcc2f1396988317448c7afc99ee3ca0d33a55ca0362030e21"} Mar 20 13:43:42 crc kubenswrapper[4755]: I0320 13:43:42.351506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerStarted","Data":"71f0d4635a5d7118a7fe912308b02cb08940fabfc621ac2d49f0b239e72fc58a"} Mar 20 13:43:43 crc kubenswrapper[4755]: I0320 13:43:43.358035 4755 
generic.go:334] "Generic (PLEG): container finished" podID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerID="9ee672984c4ceb2bcc2f1396988317448c7afc99ee3ca0d33a55ca0362030e21" exitCode=0 Mar 20 13:43:43 crc kubenswrapper[4755]: I0320 13:43:43.358094 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"9ee672984c4ceb2bcc2f1396988317448c7afc99ee3ca0d33a55ca0362030e21"} Mar 20 13:43:46 crc kubenswrapper[4755]: I0320 13:43:46.381132 4755 generic.go:334] "Generic (PLEG): container finished" podID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerID="607b709b941a51438e8f0f347993c872dd32eb89b9b069b9bbe07168c2adce9a" exitCode=0 Mar 20 13:43:46 crc kubenswrapper[4755]: I0320 13:43:46.381200 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"607b709b941a51438e8f0f347993c872dd32eb89b9b069b9bbe07168c2adce9a"} Mar 20 13:43:47 crc kubenswrapper[4755]: I0320 13:43:47.394036 4755 generic.go:334] "Generic (PLEG): container finished" podID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerID="f51e02d9a7299d98f804d8510afd259a39e50bf730e4bab151a53197c6e45525" exitCode=0 Mar 20 13:43:47 crc kubenswrapper[4755]: I0320 13:43:47.394158 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"f51e02d9a7299d98f804d8510afd259a39e50bf730e4bab151a53197c6e45525"} Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.713427 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.815815 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.816078 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.816274 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.817591 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle" (OuterVolumeSpecName: "bundle") pod "73d92c92-af26-4aa9-a774-04a1ef37b3c7" (UID: "73d92c92-af26-4aa9-a774-04a1ef37b3c7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.824838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm" (OuterVolumeSpecName: "kube-api-access-bxljm") pod "73d92c92-af26-4aa9-a774-04a1ef37b3c7" (UID: "73d92c92-af26-4aa9-a774-04a1ef37b3c7"). InnerVolumeSpecName "kube-api-access-bxljm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.829444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util" (OuterVolumeSpecName: "util") pod "73d92c92-af26-4aa9-a774-04a1ef37b3c7" (UID: "73d92c92-af26-4aa9-a774-04a1ef37b3c7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.918688 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.918751 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.918773 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:49 crc kubenswrapper[4755]: I0320 13:43:49.412018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"71f0d4635a5d7118a7fe912308b02cb08940fabfc621ac2d49f0b239e72fc58a"} Mar 20 13:43:49 crc kubenswrapper[4755]: I0320 13:43:49.412065 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:49 crc kubenswrapper[4755]: I0320 13:43:49.412093 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f0d4635a5d7118a7fe912308b02cb08940fabfc621ac2d49f0b239e72fc58a" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.223943 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-rz567"] Mar 20 13:43:51 crc kubenswrapper[4755]: E0320 13:43:51.224646 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="extract" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224677 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="extract" Mar 20 13:43:51 crc kubenswrapper[4755]: E0320 13:43:51.224704 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="pull" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224710 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="pull" Mar 20 13:43:51 crc kubenswrapper[4755]: E0320 13:43:51.224720 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="util" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224726 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="util" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224838 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="extract" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.225440 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.233033 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-g9tlc" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.235860 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.236685 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-rz567"] Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.238411 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.357233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7lkx\" (UniqueName: \"kubernetes.io/projected/93adf7be-d696-48e2-b6d5-af27b19b24e3-kube-api-access-d7lkx\") pod \"nmstate-operator-796d4cfff4-rz567\" (UID: \"93adf7be-d696-48e2-b6d5-af27b19b24e3\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.458627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7lkx\" (UniqueName: \"kubernetes.io/projected/93adf7be-d696-48e2-b6d5-af27b19b24e3-kube-api-access-d7lkx\") pod \"nmstate-operator-796d4cfff4-rz567\" (UID: \"93adf7be-d696-48e2-b6d5-af27b19b24e3\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.487026 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7lkx\" (UniqueName: \"kubernetes.io/projected/93adf7be-d696-48e2-b6d5-af27b19b24e3-kube-api-access-d7lkx\") pod \"nmstate-operator-796d4cfff4-rz567\" (UID: 
\"93adf7be-d696-48e2-b6d5-af27b19b24e3\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.546742 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.868988 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-rz567"] Mar 20 13:43:52 crc kubenswrapper[4755]: I0320 13:43:52.436194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" event={"ID":"93adf7be-d696-48e2-b6d5-af27b19b24e3","Type":"ContainerStarted","Data":"acaa4643d8798df54cd0b351bd4120eb64693e4fe05b362ad5b47b4cbf3793d2"} Mar 20 13:43:54 crc kubenswrapper[4755]: I0320 13:43:54.449442 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" event={"ID":"93adf7be-d696-48e2-b6d5-af27b19b24e3","Type":"ContainerStarted","Data":"c9d720dc57f1ab407eb947def07c09c11b539e171ec406154f73dcaf7d0ffe53"} Mar 20 13:43:54 crc kubenswrapper[4755]: I0320 13:43:54.476960 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" podStartSLOduration=1.172308329 podStartE2EDuration="3.476933319s" podCreationTimestamp="2026-03-20 13:43:51 +0000 UTC" firstStartedPulling="2026-03-20 13:43:51.881813893 +0000 UTC m=+811.479746422" lastFinishedPulling="2026-03-20 13:43:54.186438883 +0000 UTC m=+813.784371412" observedRunningTime="2026-03-20 13:43:54.473893026 +0000 UTC m=+814.071825565" watchObservedRunningTime="2026-03-20 13:43:54.476933319 +0000 UTC m=+814.074865878" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.145425 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 
13:44:00.146970 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.149893 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.150295 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.150728 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.160450 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.185572 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod \"auto-csr-approver-29566904-2crfj\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.287530 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod \"auto-csr-approver-29566904-2crfj\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.321633 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod 
\"auto-csr-approver-29566904-2crfj\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.505196 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.941289 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.296293 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.300936 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.306460 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-d78x9" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.315137 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.336710 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-72787"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.337875 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.341754 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.350015 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-72787"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.364909 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dspfd"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.365967 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5clht\" (UniqueName: \"kubernetes.io/projected/284e4beb-7815-41fc-ac59-95ed647c0d7c-kube-api-access-5clht\") pod \"nmstate-metrics-9b8c8685d-68g6g\" (UID: \"284e4beb-7815-41fc-ac59-95ed647c0d7c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404130 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-dbus-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6bw\" (UniqueName: \"kubernetes.io/projected/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-kube-api-access-qz6bw\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " 
pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404236 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-nmstate-lock\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404279 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404299 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmhf7\" (UniqueName: \"kubernetes.io/projected/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-kube-api-access-xmhf7\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-ovs-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.442067 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.443027 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.447047 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.447364 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.448332 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-vwnxs" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.456802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.502366 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-2crfj" event={"ID":"719824b6-7bd2-41dc-a61f-039b161a94d6","Type":"ContainerStarted","Data":"8be9a05a5630d65833439e7bb6c8bfa3ea50a771dc4b26d3583959dcd613b0ee"} Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9993046-1fc7-4faa-a634-f91339d94c71-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505168 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-dbus-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505193 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6bw\" (UniqueName: \"kubernetes.io/projected/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-kube-api-access-qz6bw\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505234 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-nmstate-lock\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmhf7\" (UniqueName: \"kubernetes.io/projected/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-kube-api-access-xmhf7\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-ovs-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505345 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a9993046-1fc7-4faa-a634-f91339d94c71-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505378 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrqq\" (UniqueName: \"kubernetes.io/projected/a9993046-1fc7-4faa-a634-f91339d94c71-kube-api-access-hnrqq\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505407 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5clht\" (UniqueName: \"kubernetes.io/projected/284e4beb-7815-41fc-ac59-95ed647c0d7c-kube-api-access-5clht\") pod \"nmstate-metrics-9b8c8685d-68g6g\" (UID: \"284e4beb-7815-41fc-ac59-95ed647c0d7c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505359 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-nmstate-lock\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-ovs-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 
13:44:01.505636 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-dbus-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.521453 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.521469 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6bw\" (UniqueName: \"kubernetes.io/projected/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-kube-api-access-qz6bw\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.522527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmhf7\" (UniqueName: \"kubernetes.io/projected/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-kube-api-access-xmhf7\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.531013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5clht\" (UniqueName: \"kubernetes.io/projected/284e4beb-7815-41fc-ac59-95ed647c0d7c-kube-api-access-5clht\") pod \"nmstate-metrics-9b8c8685d-68g6g\" (UID: \"284e4beb-7815-41fc-ac59-95ed647c0d7c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.606180 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9993046-1fc7-4faa-a634-f91339d94c71-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.606565 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a9993046-1fc7-4faa-a634-f91339d94c71-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.606584 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrqq\" (UniqueName: \"kubernetes.io/projected/a9993046-1fc7-4faa-a634-f91339d94c71-kube-api-access-hnrqq\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.607823 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a9993046-1fc7-4faa-a634-f91339d94c71-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.611215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9993046-1fc7-4faa-a634-f91339d94c71-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.627756 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrqq\" (UniqueName: \"kubernetes.io/projected/a9993046-1fc7-4faa-a634-f91339d94c71-kube-api-access-hnrqq\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.647118 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.663555 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.663590 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d8f88c4cc-78flh"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.664686 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.675775 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8f88c4cc-78flh"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.679335 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708769 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-oauth-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708828 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pq4v\" (UniqueName: \"kubernetes.io/projected/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-kube-api-access-6pq4v\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708860 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-service-ca\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708951 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-trusted-ca-bundle\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.710217 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-oauth-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.766297 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.811712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-oauth-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.811996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pq4v\" (UniqueName: \"kubernetes.io/projected/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-kube-api-access-6pq4v\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812056 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-service-ca\") pod \"console-6d8f88c4cc-78flh\" (UID: 
\"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812076 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-trusted-ca-bundle\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-oauth-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.813483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-service-ca\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.814608 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-trusted-ca-bundle\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.815170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " 
pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.815418 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-oauth-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.821129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-oauth-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.822382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.834804 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pq4v\" (UniqueName: \"kubernetes.io/projected/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-kube-api-access-6pq4v\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.948401 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-72787"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.986599 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g"] Mar 20 13:44:01 crc 
kubenswrapper[4755]: I0320 13:44:01.990508 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.073070 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk"] Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.201175 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8f88c4cc-78flh"] Mar 20 13:44:02 crc kubenswrapper[4755]: W0320 13:44:02.209095 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fdb87ca_2790_4a05_8438_d3b5ae3b78da.slice/crio-a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82 WatchSource:0}: Error finding container a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82: Status 404 returned error can't find the container with id a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82 Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.509613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8f88c4cc-78flh" event={"ID":"9fdb87ca-2790-4a05-8438-d3b5ae3b78da","Type":"ContainerStarted","Data":"b0ba923a011fb330fc1f9b322df9a726e62a33c31eb8cb98c80655a2e0a6fb99"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.509719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8f88c4cc-78flh" event={"ID":"9fdb87ca-2790-4a05-8438-d3b5ae3b78da","Type":"ContainerStarted","Data":"a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.510896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dspfd" 
event={"ID":"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68","Type":"ContainerStarted","Data":"53eb56592f8c80638a844cf0f7c7e67f21a4aadcafcd03edcc772ed58ef83739"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.512735 4755 generic.go:334] "Generic (PLEG): container finished" podID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerID="9412ee211cf01afed52e63d1365ec0ed2b0d225ddc278755d3632e23fa6fff43" exitCode=0 Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.512835 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-2crfj" event={"ID":"719824b6-7bd2-41dc-a61f-039b161a94d6","Type":"ContainerDied","Data":"9412ee211cf01afed52e63d1365ec0ed2b0d225ddc278755d3632e23fa6fff43"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.513617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" event={"ID":"284e4beb-7815-41fc-ac59-95ed647c0d7c","Type":"ContainerStarted","Data":"bef119c37e5ad93b35f2a2adc96dd82a2608d0fe70e2a5eeeb98ec9d894fc009"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.514247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" event={"ID":"a9993046-1fc7-4faa-a634-f91339d94c71","Type":"ContainerStarted","Data":"6bef5544c50e2b2dd9a459e7c413b33879930338956139addf3cddb53d6f1617"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.515532 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" event={"ID":"36f8cd57-a5ee-4a30-b7b6-8f13d698861c","Type":"ContainerStarted","Data":"91116addc9df5afeb945720384912e548f8a1317630820f06de6e469d3480d66"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.532166 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d8f88c4cc-78flh" podStartSLOduration=1.532146005 podStartE2EDuration="1.532146005s" 
podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:02.532027442 +0000 UTC m=+822.129959981" watchObservedRunningTime="2026-03-20 13:44:02.532146005 +0000 UTC m=+822.130078534" Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.759211 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.839855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod \"719824b6-7bd2-41dc-a61f-039b161a94d6\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.864609 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb" (OuterVolumeSpecName: "kube-api-access-6s7wb") pod "719824b6-7bd2-41dc-a61f-039b161a94d6" (UID: "719824b6-7bd2-41dc-a61f-039b161a94d6"). InnerVolumeSpecName "kube-api-access-6s7wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.941905 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.529218 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-2crfj" event={"ID":"719824b6-7bd2-41dc-a61f-039b161a94d6","Type":"ContainerDied","Data":"8be9a05a5630d65833439e7bb6c8bfa3ea50a771dc4b26d3583959dcd613b0ee"} Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.529260 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8be9a05a5630d65833439e7bb6c8bfa3ea50a771dc4b26d3583959dcd613b0ee" Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.529271 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.819946 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.825378 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.236262 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" path="/var/lib/kubelet/pods/cb576c19-7f49-40ac-987b-5eefb5db31ce/volumes" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.538278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" 
event={"ID":"284e4beb-7815-41fc-ac59-95ed647c0d7c","Type":"ContainerStarted","Data":"add667afab7a32c46e2d27f11ae0d50f887e086ed1ee76fe95f59029e791f567"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.540311 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" event={"ID":"a9993046-1fc7-4faa-a634-f91339d94c71","Type":"ContainerStarted","Data":"f950d904399f59a00448fd1b2d442e0f328b7ab728925a2ecbcff05f89c11cef"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.544182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" event={"ID":"36f8cd57-a5ee-4a30-b7b6-8f13d698861c","Type":"ContainerStarted","Data":"b8777256907f0e08b2fab5947add70027e568375d5e358a7f1fd87e1d67c2b58"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.544301 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.547877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dspfd" event={"ID":"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68","Type":"ContainerStarted","Data":"83cbde2e0e11c48350d4dc1a0f1d0dfac6682f6b3b81fac89ef0efa3c0623a5d"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.548341 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.561871 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" podStartSLOduration=1.725631763 podStartE2EDuration="4.561848655s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:02.08208399 +0000 UTC m=+821.680016519" lastFinishedPulling="2026-03-20 13:44:04.918300882 +0000 UTC m=+824.516233411" 
observedRunningTime="2026-03-20 13:44:05.559010588 +0000 UTC m=+825.156943127" watchObservedRunningTime="2026-03-20 13:44:05.561848655 +0000 UTC m=+825.159781194" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.589137 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dspfd" podStartSLOduration=1.4071196160000001 podStartE2EDuration="4.589116187s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:01.73262766 +0000 UTC m=+821.330560189" lastFinishedPulling="2026-03-20 13:44:04.914624231 +0000 UTC m=+824.512556760" observedRunningTime="2026-03-20 13:44:05.588418748 +0000 UTC m=+825.186351277" watchObservedRunningTime="2026-03-20 13:44:05.589116187 +0000 UTC m=+825.187048716" Mar 20 13:44:09 crc kubenswrapper[4755]: I0320 13:44:09.585829 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" event={"ID":"284e4beb-7815-41fc-ac59-95ed647c0d7c","Type":"ContainerStarted","Data":"79a786ab22b7080c5290cefa0648f56126000f30c57dd2b8c2a8ffbe8488840d"} Mar 20 13:44:09 crc kubenswrapper[4755]: I0320 13:44:09.615195 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" podStartSLOduration=1.6604807240000001 podStartE2EDuration="8.615156862s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:02.002816726 +0000 UTC m=+821.600749255" lastFinishedPulling="2026-03-20 13:44:08.957492864 +0000 UTC m=+828.555425393" observedRunningTime="2026-03-20 13:44:09.61101175 +0000 UTC m=+829.208944349" watchObservedRunningTime="2026-03-20 13:44:09.615156862 +0000 UTC m=+829.213089441" Mar 20 13:44:09 crc kubenswrapper[4755]: I0320 13:44:09.616121 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" podStartSLOduration=5.65377989 
podStartE2EDuration="8.616104688s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:01.966771326 +0000 UTC m=+821.564703865" lastFinishedPulling="2026-03-20 13:44:04.929096134 +0000 UTC m=+824.527028663" observedRunningTime="2026-03-20 13:44:05.616953944 +0000 UTC m=+825.214886483" watchObservedRunningTime="2026-03-20 13:44:09.616104688 +0000 UTC m=+829.214037257" Mar 20 13:44:11 crc kubenswrapper[4755]: I0320 13:44:11.716910 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:11 crc kubenswrapper[4755]: I0320 13:44:11.991796 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:11 crc kubenswrapper[4755]: I0320 13:44:11.991902 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:12 crc kubenswrapper[4755]: I0320 13:44:12.000220 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:12 crc kubenswrapper[4755]: I0320 13:44:12.611637 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:12 crc kubenswrapper[4755]: I0320 13:44:12.681499 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:44:21 crc kubenswrapper[4755]: I0320 13:44:21.671705 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:22 crc kubenswrapper[4755]: I0320 13:44:22.340928 4755 scope.go:117] "RemoveContainer" containerID="0475ddf40f1f946ce60b2db15f62182a6200adfb2c95b60479c44432cfa187cc" Mar 20 13:44:29 crc kubenswrapper[4755]: I0320 13:44:29.238442 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and 
Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.596092 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"] Mar 20 13:44:37 crc kubenswrapper[4755]: E0320 13:44:37.597292 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerName="oc" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.597311 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerName="oc" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.597468 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerName="oc" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.598682 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.603157 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.610182 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"] Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.753996 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rb5zn" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console" containerID="cri-o://664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea" gracePeriod=15 Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.770038 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.770098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.770151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.898997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.899130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.899245 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.900857 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.901253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.937606 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.232633 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.242823 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rb5zn_27405a42-41b4-4521-93f3-41d029fab255/console/0.log" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.242905 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313775 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313871 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.314022 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.314130 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.314167 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.315749 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca" (OuterVolumeSpecName: "service-ca") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.316114 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config" (OuterVolumeSpecName: "console-config") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.319366 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.320457 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.321006 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx" (OuterVolumeSpecName: "kube-api-access-b95lx") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "kube-api-access-b95lx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.327368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.328057 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415887 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415947 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415962 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415971 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415998 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.416007 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.416016 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.493180 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"] Mar 20 13:44:38 crc kubenswrapper[4755]: W0320 13:44:38.495151 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e8346da_4c59_4f8f_9804_02ad176bc15d.slice/crio-3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40 WatchSource:0}: Error finding container 3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40: Status 404 returned error can't find the container with id 3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40 Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845042 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rb5zn_27405a42-41b4-4521-93f3-41d029fab255/console/0.log" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845119 4755 
generic.go:334] "Generic (PLEG): container finished" podID="27405a42-41b4-4521-93f3-41d029fab255" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea" exitCode=2 Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerDied","Data":"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"} Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845234 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerDied","Data":"a883711492469aba5080025f39ee56d456d68c7d62a0b2da2289bad36e4ed8ea"} Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845290 4755 scope.go:117] "RemoveContainer" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.852397 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerID="b0a79f5349b8f36bb4e504654c5686b5e5f4facc31a9a186a8ef5a0b760f88d5" exitCode=0 Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.852460 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"b0a79f5349b8f36bb4e504654c5686b5e5f4facc31a9a186a8ef5a0b760f88d5"} Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.852503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerStarted","Data":"3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40"} Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.866944 4755 scope.go:117] "RemoveContainer" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea" Mar 20 13:44:38 crc kubenswrapper[4755]: E0320 13:44:38.872492 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea\": container with ID starting with 664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea not found: ID does not exist" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.872595 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"} err="failed to get container status \"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea\": rpc error: code = NotFound desc = could not find container \"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea\": container with ID starting with 664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea not found: ID does not exist" Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.908874 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.916517 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.238644 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27405a42-41b4-4521-93f3-41d029fab255" 
path="/var/lib/kubelet/pods/27405a42-41b4-4521-93f3-41d029fab255/volumes" Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.930985 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"] Mar 20 13:44:39 crc kubenswrapper[4755]: E0320 13:44:39.931689 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console" Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.931709 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console" Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.931946 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console" Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.933275 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.961680 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"] Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.045099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.045231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 
13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.045258 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.146486 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.146836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.146985 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.147305 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: 
I0320 13:44:40.147522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.173028 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.271811 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.528101 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"] Mar 20 13:44:40 crc kubenswrapper[4755]: W0320 13:44:40.539600 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae13042d_ead5_4853_8f3e_cc16f6b3515f.slice/crio-b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d WatchSource:0}: Error finding container b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d: Status 404 returned error can't find the container with id b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.872292 4755 generic.go:334] "Generic (PLEG): container finished" podID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547" exitCode=0 Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.872352 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547"} Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.872426 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerStarted","Data":"b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d"} Mar 20 13:44:42 crc kubenswrapper[4755]: I0320 13:44:42.887228 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerID="10e74951c5cfdeb46b551d50b4326ea04ba813a6a09c3fcd46546f102ad9b3a6" exitCode=0 Mar 20 13:44:42 crc kubenswrapper[4755]: I0320 13:44:42.887297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"10e74951c5cfdeb46b551d50b4326ea04ba813a6a09c3fcd46546f102ad9b3a6"} Mar 20 13:44:42 crc kubenswrapper[4755]: I0320 13:44:42.902113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerStarted","Data":"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"} Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.914691 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerID="ff1a63dbaca1263c3bef126afa3f8efe75b50982f67c8485e65c31fdd4d68c3d" exitCode=0 Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.914781 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" 
event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"ff1a63dbaca1263c3bef126afa3f8efe75b50982f67c8485e65c31fdd4d68c3d"} Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.919842 4755 generic.go:334] "Generic (PLEG): container finished" podID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9" exitCode=0 Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.919932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"} Mar 20 13:44:44 crc kubenswrapper[4755]: I0320 13:44:44.930689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerStarted","Data":"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"} Mar 20 13:44:44 crc kubenswrapper[4755]: I0320 13:44:44.957784 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v8mxq" podStartSLOduration=2.42720216 podStartE2EDuration="5.957749685s" podCreationTimestamp="2026-03-20 13:44:39 +0000 UTC" firstStartedPulling="2026-03-20 13:44:40.874345291 +0000 UTC m=+860.472277830" lastFinishedPulling="2026-03-20 13:44:44.404892806 +0000 UTC m=+864.002825355" observedRunningTime="2026-03-20 13:44:44.949912053 +0000 UTC m=+864.547844622" watchObservedRunningTime="2026-03-20 13:44:44.957749685 +0000 UTC m=+864.555682274" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.257633 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.336601 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod \"0e8346da-4c59-4f8f-9804-02ad176bc15d\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.336707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"0e8346da-4c59-4f8f-9804-02ad176bc15d\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.336744 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"0e8346da-4c59-4f8f-9804-02ad176bc15d\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.339965 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle" (OuterVolumeSpecName: "bundle") pod "0e8346da-4c59-4f8f-9804-02ad176bc15d" (UID: "0e8346da-4c59-4f8f-9804-02ad176bc15d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.341082 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.345288 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw" (OuterVolumeSpecName: "kube-api-access-7jhnw") pod "0e8346da-4c59-4f8f-9804-02ad176bc15d" (UID: "0e8346da-4c59-4f8f-9804-02ad176bc15d"). InnerVolumeSpecName "kube-api-access-7jhnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.352885 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util" (OuterVolumeSpecName: "util") pod "0e8346da-4c59-4f8f-9804-02ad176bc15d" (UID: "0e8346da-4c59-4f8f-9804-02ad176bc15d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.443646 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.443756 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.939742 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40"} Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.940199 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40" Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.939810 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" Mar 20 13:44:50 crc kubenswrapper[4755]: I0320 13:44:50.272747 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:50 crc kubenswrapper[4755]: I0320 13:44:50.273191 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:44:51 crc kubenswrapper[4755]: I0320 13:44:51.367242 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8mxq" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server" probeResult="failure" output=< Mar 20 13:44:51 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:44:51 crc kubenswrapper[4755]: > Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.694619 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"] Mar 20 13:44:55 crc kubenswrapper[4755]: E0320 13:44:55.695394 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="util" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695410 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="util" Mar 20 13:44:55 crc kubenswrapper[4755]: E0320 13:44:55.695425 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="pull" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695432 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="pull" Mar 20 13:44:55 crc kubenswrapper[4755]: E0320 13:44:55.695449 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="extract" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695457 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="extract" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695585 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="extract" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.696100 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.699233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.699337 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.699467 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.706735 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.706828 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9wmfx" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.720632 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"] Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.793239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-apiservice-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.793347 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-webhook-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.793388 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dnq\" (UniqueName: \"kubernetes.io/projected/32289872-a679-4d10-8b2f-0519c713dc35-kube-api-access-z2dnq\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.894280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-webhook-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.894343 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dnq\" (UniqueName: \"kubernetes.io/projected/32289872-a679-4d10-8b2f-0519c713dc35-kube-api-access-z2dnq\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: 
\"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.894432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-apiservice-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.902169 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-webhook-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.902188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-apiservice-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.914135 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dnq\" (UniqueName: \"kubernetes.io/projected/32289872-a679-4d10-8b2f-0519c713dc35-kube-api-access-z2dnq\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.014448 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"] Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.015249 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021304 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021426 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bgjr9" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.036388 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"] Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.097838 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-webhook-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.097892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-apiservice-cert\") pod 
\"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.097922 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666w6\" (UniqueName: \"kubernetes.io/projected/f0274fca-6425-402c-a2aa-853b232ad93c-kube-api-access-666w6\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.199043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-666w6\" (UniqueName: \"kubernetes.io/projected/f0274fca-6425-402c-a2aa-853b232ad93c-kube-api-access-666w6\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.199487 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-webhook-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.199514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-apiservice-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.204508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-webhook-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.204566 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-apiservice-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.215473 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666w6\" (UniqueName: \"kubernetes.io/projected/f0274fca-6425-402c-a2aa-853b232ad93c-kube-api-access-666w6\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.272588 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"]
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.336459 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.571461 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"]
Mar 20 13:44:56 crc kubenswrapper[4755]: W0320 13:44:56.581935 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0274fca_6425_402c_a2aa_853b232ad93c.slice/crio-0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02 WatchSource:0}: Error finding container 0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02: Status 404 returned error can't find the container with id 0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02
Mar 20 13:44:57 crc kubenswrapper[4755]: I0320 13:44:57.016393 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" event={"ID":"f0274fca-6425-402c-a2aa-853b232ad93c","Type":"ContainerStarted","Data":"0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02"}
Mar 20 13:44:57 crc kubenswrapper[4755]: I0320 13:44:57.017974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" event={"ID":"32289872-a679-4d10-8b2f-0519c713dc35","Type":"ContainerStarted","Data":"9bc843492cee7820681aa9d684db343e507a5861f4fc13d90fbab9ba6d4f0e68"}
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.047139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" event={"ID":"32289872-a679-4d10-8b2f-0519c713dc35","Type":"ContainerStarted","Data":"69561a3396b227b22bee31a3e412678dee6bd7981c81da53433ca78e25ba850d"}
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.047722 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.071850 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" podStartSLOduration=1.60250275 podStartE2EDuration="5.071823171s" podCreationTimestamp="2026-03-20 13:44:55 +0000 UTC" firstStartedPulling="2026-03-20 13:44:56.279768656 +0000 UTC m=+875.877701185" lastFinishedPulling="2026-03-20 13:44:59.749089067 +0000 UTC m=+879.347021606" observedRunningTime="2026-03-20 13:45:00.066002623 +0000 UTC m=+879.663935142" watchObservedRunningTime="2026-03-20 13:45:00.071823171 +0000 UTC m=+879.669755700"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.137392 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"]
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.138797 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.141321 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.141668 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.150143 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"]
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.265272 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.265368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.265411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.366305 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.366729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.366846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.367446 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.388769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.393082 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.426940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.459092 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.480753 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.703471 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"]
Mar 20 13:45:00 crc kubenswrapper[4755]: W0320 13:45:00.712836 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d412e5_5c09_4c6d_ba8d_db4546796c70.slice/crio-88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a WatchSource:0}: Error finding container 88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a: Status 404 returned error can't find the container with id 88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a
Mar 20 13:45:01 crc kubenswrapper[4755]: I0320 13:45:01.055042 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerStarted","Data":"a1448ef33f49866d32c38e4909005daac13db09c22679974202f30e41c06028f"}
Mar 20 13:45:01 crc kubenswrapper[4755]: I0320 13:45:01.055119 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerStarted","Data":"88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a"}
Mar 20 13:45:01 crc kubenswrapper[4755]: I0320 13:45:01.274003 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" podStartSLOduration=1.27396798 podStartE2EDuration="1.27396798s" podCreationTimestamp="2026-03-20 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:01.076882322 +0000 UTC m=+880.674814851" watchObservedRunningTime="2026-03-20 13:45:01.27396798 +0000 UTC m=+880.871900549"
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.063641 4755 generic.go:334] "Generic (PLEG): container finished" podID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerID="a1448ef33f49866d32c38e4909005daac13db09c22679974202f30e41c06028f" exitCode=0
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.063705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerDied","Data":"a1448ef33f49866d32c38e4909005daac13db09c22679974202f30e41c06028f"}
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.114930 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"]
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.115207 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v8mxq" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server" containerID="cri-o://88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd" gracePeriod=2
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.532002 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.603939 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") "
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.604074 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") "
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.604101 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") "
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.605151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities" (OuterVolumeSpecName: "utilities") pod "ae13042d-ead5-4853-8f3e-cc16f6b3515f" (UID: "ae13042d-ead5-4853-8f3e-cc16f6b3515f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.612884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr" (OuterVolumeSpecName: "kube-api-access-hq9hr") pod "ae13042d-ead5-4853-8f3e-cc16f6b3515f" (UID: "ae13042d-ead5-4853-8f3e-cc16f6b3515f"). InnerVolumeSpecName "kube-api-access-hq9hr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.705884 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.705936 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.733581 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae13042d-ead5-4853-8f3e-cc16f6b3515f" (UID: "ae13042d-ead5-4853-8f3e-cc16f6b3515f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.807744 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.075902 4755 generic.go:334] "Generic (PLEG): container finished" podID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd" exitCode=0
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"}
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d"}
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076731 4755 scope.go:117] "RemoveContainer" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076252 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.116908 4755 scope.go:117] "RemoveContainer" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.134129 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"]
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.140444 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"]
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.146923 4755 scope.go:117] "RemoveContainer" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.166032 4755 scope.go:117] "RemoveContainer" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"
Mar 20 13:45:03 crc kubenswrapper[4755]: E0320 13:45:03.166528 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd\": container with ID starting with 88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd not found: ID does not exist" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.166569 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"} err="failed to get container status \"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd\": rpc error: code = NotFound desc = could not find container \"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd\": container with ID starting with 88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd not found: ID does not exist"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.166600 4755 scope.go:117] "RemoveContainer" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"
Mar 20 13:45:03 crc kubenswrapper[4755]: E0320 13:45:03.167120 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9\": container with ID starting with d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9 not found: ID does not exist" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.167183 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"} err="failed to get container status \"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9\": rpc error: code = NotFound desc = could not find container \"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9\": container with ID starting with d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9 not found: ID does not exist"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.167227 4755 scope.go:117] "RemoveContainer" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547"
Mar 20 13:45:03 crc kubenswrapper[4755]: E0320 13:45:03.167640 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547\": container with ID starting with 15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547 not found: ID does not exist" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.167745 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547"} err="failed to get container status \"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547\": rpc error: code = NotFound desc = could not find container \"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547\": container with ID starting with 15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547 not found: ID does not exist"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.237774 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" path="/var/lib/kubelet/pods/ae13042d-ead5-4853-8f3e-cc16f6b3515f/volumes"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.405047 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531041 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"94d412e5-5c09-4c6d-ba8d-db4546796c70\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") "
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531119 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"94d412e5-5c09-4c6d-ba8d-db4546796c70\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") "
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531158 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"94d412e5-5c09-4c6d-ba8d-db4546796c70\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") "
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531965 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume" (OuterVolumeSpecName: "config-volume") pod "94d412e5-5c09-4c6d-ba8d-db4546796c70" (UID: "94d412e5-5c09-4c6d-ba8d-db4546796c70"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.532267 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.537171 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh" (OuterVolumeSpecName: "kube-api-access-fjqzh") pod "94d412e5-5c09-4c6d-ba8d-db4546796c70" (UID: "94d412e5-5c09-4c6d-ba8d-db4546796c70"). InnerVolumeSpecName "kube-api-access-fjqzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.537224 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "94d412e5-5c09-4c6d-ba8d-db4546796c70" (UID: "94d412e5-5c09-4c6d-ba8d-db4546796c70"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.634427 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.634491 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:04 crc kubenswrapper[4755]: I0320 13:45:04.087461 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerDied","Data":"88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a"}
Mar 20 13:45:04 crc kubenswrapper[4755]: I0320 13:45:04.087516 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a"
Mar 20 13:45:04 crc kubenswrapper[4755]: I0320 13:45:04.087585 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"
Mar 20 13:45:06 crc kubenswrapper[4755]: I0320 13:45:06.751921 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:45:06 crc kubenswrapper[4755]: I0320 13:45:06.752560 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:45:07 crc kubenswrapper[4755]: I0320 13:45:07.109330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" event={"ID":"f0274fca-6425-402c-a2aa-853b232ad93c","Type":"ContainerStarted","Data":"b073f93f45a830615501a5b90f151f4ca77a77502554cec9ca5c5238bcb64a95"}
Mar 20 13:45:07 crc kubenswrapper[4755]: I0320 13:45:07.109676 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:45:07 crc kubenswrapper[4755]: I0320 13:45:07.143119 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" podStartSLOduration=2.271597668 podStartE2EDuration="12.143092748s" podCreationTimestamp="2026-03-20 13:44:55 +0000 UTC" firstStartedPulling="2026-03-20 13:44:56.585838876 +0000 UTC m=+876.183771405" lastFinishedPulling="2026-03-20 13:45:06.457333956 +0000 UTC m=+886.055266485" observedRunningTime="2026-03-20 13:45:07.135324527 +0000 UTC m=+886.733257076" watchObservedRunningTime="2026-03-20 13:45:07.143092748 +0000 UTC m=+886.741025277"
Mar 20 13:45:16 crc kubenswrapper[4755]: I0320 13:45:16.342829 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.025988 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.751359 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.751434 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.836514 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5l5hs"]
Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.836889 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-content"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.836912 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-content"
Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.836936 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerName="collect-profiles"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.836948 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerName="collect-profiles"
Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.836974 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.836986 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server"
Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.837008 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-utilities"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.837018 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-utilities"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.837195 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerName="collect-profiles"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.837222 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.845680 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"]
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.845871 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.847404 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850033 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850742 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qt2g7"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850087 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850676 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.858668 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"]
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943495 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-conf\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943569 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943628 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqp5\" (UniqueName: \"kubernetes.io/projected/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-kube-api-access-fsqp5\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943710 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxfk\" (UniqueName: \"kubernetes.io/projected/1152c78e-15f9-4826-acc3-3d7f5765db68-kube-api-access-gxxfk\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943739 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-sockets\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943804 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-startup\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943833 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-reloader\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.973841 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6vf4n"]
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.974755 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6vf4n"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983066 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983564 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983730 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983894 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fb2f6"
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.993360 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-qsbbn"]
Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.994326 4755 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.999406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-sockets\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045585 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-startup\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045617 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045668 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-reloader\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045698 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-conf\") pod \"frr-k8s-5l5hs\" (UID: 
\"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045768 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blmrj\" (UniqueName: \"kubernetes.io/projected/a71f1548-62b5-4a77-9655-735bafa396c8-kube-api-access-blmrj\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqp5\" (UniqueName: \"kubernetes.io/projected/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-kube-api-access-fsqp5\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045858 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-cert\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: 
\"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045881 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxfk\" (UniqueName: \"kubernetes.io/projected/1152c78e-15f9-4826-acc3-3d7f5765db68-kube-api-access-gxxfk\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-metrics-certs\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp82l\" (UniqueName: \"kubernetes.io/projected/839a8db3-662c-41c4-bb63-6b1027901ab5-kube-api-access-xp82l\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/839a8db3-662c-41c4-bb63-6b1027901ab5-metallb-excludel2\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " 
pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045991 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-metrics-certs\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.046282 4755 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.046372 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs podName:1152c78e-15f9-4826-acc3-3d7f5765db68 nodeName:}" failed. No retries permitted until 2026-03-20 13:45:37.546353085 +0000 UTC m=+917.144285614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs") pod "frr-k8s-5l5hs" (UID: "1152c78e-15f9-4826-acc3-3d7f5765db68") : secret "frr-k8s-certs-secret" not found Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.046480 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-sockets\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.047093 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.047264 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-startup\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.047306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-conf\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.056520 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.067006 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-reloader\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.076071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxfk\" (UniqueName: \"kubernetes.io/projected/1152c78e-15f9-4826-acc3-3d7f5765db68-kube-api-access-gxxfk\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.085423 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqp5\" (UniqueName: \"kubernetes.io/projected/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-kube-api-access-fsqp5\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.089475 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qsbbn"] Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blmrj\" (UniqueName: \"kubernetes.io/projected/a71f1548-62b5-4a77-9655-735bafa396c8-kube-api-access-blmrj\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-cert\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-metrics-certs\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp82l\" (UniqueName: \"kubernetes.io/projected/839a8db3-662c-41c4-bb63-6b1027901ab5-kube-api-access-xp82l\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147751 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/839a8db3-662c-41c4-bb63-6b1027901ab5-metallb-excludel2\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147769 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-metrics-certs\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.147951 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.148007 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist podName:839a8db3-662c-41c4-bb63-6b1027901ab5 nodeName:}" failed. No retries permitted until 2026-03-20 13:45:37.647988061 +0000 UTC m=+917.245920590 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist") pod "speaker-6vf4n" (UID: "839a8db3-662c-41c4-bb63-6b1027901ab5") : secret "metallb-memberlist" not found Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.154541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/839a8db3-662c-41c4-bb63-6b1027901ab5-metallb-excludel2\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.157033 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.160908 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-cert\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.161161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-metrics-certs\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.167432 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-metrics-certs\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.176526 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.177594 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blmrj\" (UniqueName: \"kubernetes.io/projected/a71f1548-62b5-4a77-9655-735bafa396c8-kube-api-access-blmrj\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.188470 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp82l\" (UniqueName: \"kubernetes.io/projected/839a8db3-662c-41c4-bb63-6b1027901ab5-kube-api-access-xp82l\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.306502 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.557362 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.565127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.625372 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"] Mar 20 13:45:37 crc kubenswrapper[4755]: W0320 13:45:37.632962 4755 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod490ee5e7_c0b1_4181_b7ac_86e5e61253a0.slice/crio-02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26 WatchSource:0}: Error finding container 02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26: Status 404 returned error can't find the container with id 02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26 Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.636943 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.659537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.659767 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.659851 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist podName:839a8db3-662c-41c4-bb63-6b1027901ab5 nodeName:}" failed. No retries permitted until 2026-03-20 13:45:38.659824945 +0000 UTC m=+918.257757484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist") pod "speaker-6vf4n" (UID: "839a8db3-662c-41c4-bb63-6b1027901ab5") : secret "metallb-memberlist" not found Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.765778 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.863517 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qsbbn"] Mar 20 13:45:37 crc kubenswrapper[4755]: W0320 13:45:37.869762 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71f1548_62b5_4a77_9655_735bafa396c8.slice/crio-524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43 WatchSource:0}: Error finding container 524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43: Status 404 returned error can't find the container with id 524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43 Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.328436 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qsbbn" event={"ID":"a71f1548-62b5-4a77-9655-735bafa396c8","Type":"ContainerStarted","Data":"d081085834f2957c6aa6a031fbdc25b3dc6e8284e1d3b312a5af2d26282ccc95"} Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.328923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qsbbn" event={"ID":"a71f1548-62b5-4a77-9655-735bafa396c8","Type":"ContainerStarted","Data":"c1926c59c858fce55632d79e0f6389caba56bce2095eca782f46e6256acea58c"} Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.328942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qsbbn" event={"ID":"a71f1548-62b5-4a77-9655-735bafa396c8","Type":"ContainerStarted","Data":"524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43"} Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.329307 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.331966 4755 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" event={"ID":"490ee5e7-c0b1-4181-b7ac-86e5e61253a0","Type":"ContainerStarted","Data":"02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26"} Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.333869 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"fe82e7581cc8721da1697a5bf21d324aa60639e7864baf45db43acdeee5d9db4"} Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.354305 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-qsbbn" podStartSLOduration=2.354274147 podStartE2EDuration="2.354274147s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:38.351922383 +0000 UTC m=+917.949854952" watchObservedRunningTime="2026-03-20 13:45:38.354274147 +0000 UTC m=+917.952206716" Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.679535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.688090 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.788272 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6vf4n" Mar 20 13:45:38 crc kubenswrapper[4755]: W0320 13:45:38.833856 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod839a8db3_662c_41c4_bb63_6b1027901ab5.slice/crio-070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14 WatchSource:0}: Error finding container 070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14: Status 404 returned error can't find the container with id 070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14 Mar 20 13:45:39 crc kubenswrapper[4755]: I0320 13:45:39.349791 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6vf4n" event={"ID":"839a8db3-662c-41c4-bb63-6b1027901ab5","Type":"ContainerStarted","Data":"46f6128ee887d20789e0b1476a1a3cab480f929c2b6aef166e62d3ae334a51cb"} Mar 20 13:45:39 crc kubenswrapper[4755]: I0320 13:45:39.350299 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6vf4n" event={"ID":"839a8db3-662c-41c4-bb63-6b1027901ab5","Type":"ContainerStarted","Data":"070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14"} Mar 20 13:45:40 crc kubenswrapper[4755]: I0320 13:45:40.359225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6vf4n" event={"ID":"839a8db3-662c-41c4-bb63-6b1027901ab5","Type":"ContainerStarted","Data":"e59f930c84cd84e3d3023623c4734de877d54ca96977002979d975c208b57b67"} Mar 20 13:45:40 crc kubenswrapper[4755]: I0320 13:45:40.360402 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6vf4n" Mar 20 13:45:40 crc kubenswrapper[4755]: I0320 13:45:40.396220 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6vf4n" podStartSLOduration=4.396195169 podStartE2EDuration="4.396195169s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:40.394075631 +0000 UTC m=+919.992008160" watchObservedRunningTime="2026-03-20 13:45:40.396195169 +0000 UTC m=+919.994127698" Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.446016 4755 generic.go:334] "Generic (PLEG): container finished" podID="1152c78e-15f9-4826-acc3-3d7f5765db68" containerID="ed53424a28d633c3e292590ed8accef6bafe2195383ff37ed03e25723c93fb40" exitCode=0 Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.446082 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerDied","Data":"ed53424a28d633c3e292590ed8accef6bafe2195383ff37ed03e25723c93fb40"} Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.450073 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" event={"ID":"490ee5e7-c0b1-4181-b7ac-86e5e61253a0","Type":"ContainerStarted","Data":"2b85f34fd7d6132829066984b3845c42624a4b339ab8d945fbf7b585b2795b83"} Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.450515 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:47 crc kubenswrapper[4755]: I0320 13:45:47.459165 4755 generic.go:334] "Generic (PLEG): container finished" podID="1152c78e-15f9-4826-acc3-3d7f5765db68" containerID="ac7ec846a4acdfb04966b3d9ee1849da3c74a5dec549e78b7ee3fcd6a0a66380" exitCode=0 Mar 20 13:45:47 crc kubenswrapper[4755]: I0320 13:45:47.459283 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerDied","Data":"ac7ec846a4acdfb04966b3d9ee1849da3c74a5dec549e78b7ee3fcd6a0a66380"} Mar 20 13:45:47 crc kubenswrapper[4755]: I0320 13:45:47.513002 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" podStartSLOduration=3.489145042 podStartE2EDuration="11.512972482s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" firstStartedPulling="2026-03-20 13:45:37.63649189 +0000 UTC m=+917.234424419" lastFinishedPulling="2026-03-20 13:45:45.66031933 +0000 UTC m=+925.258251859" observedRunningTime="2026-03-20 13:45:46.494900435 +0000 UTC m=+926.092832964" watchObservedRunningTime="2026-03-20 13:45:47.512972482 +0000 UTC m=+927.110905021" Mar 20 13:45:48 crc kubenswrapper[4755]: I0320 13:45:48.469060 4755 generic.go:334] "Generic (PLEG): container finished" podID="1152c78e-15f9-4826-acc3-3d7f5765db68" containerID="d4c39ae49ed20a2410ce4c58a48e2d2553860145121ddff9391e829737679c5f" exitCode=0 Mar 20 13:45:48 crc kubenswrapper[4755]: I0320 13:45:48.469143 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerDied","Data":"d4c39ae49ed20a2410ce4c58a48e2d2553860145121ddff9391e829737679c5f"} Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"53be76c677173dce74a8ced31f2dae6b2875ad21b7a4360bd7f85ea5468f7d47"} Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484552 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"307fb133cb09d3633efa9cc9e62abc7b36a901314d32fc82f18e41b671601b89"} Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" 
event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"647da65ab737d68b0933db7cc763e0d8d81a12ead2e60418703f0b53fcd7cc6f"} Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484575 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"56e527f6cffc79a1659a7c5d29290a27f868a61605184138e68459ed6cacf974"} Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.496247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"82cfc7bd014a8eaea9b9a148bb9a767450366267f4c09ec4c1743b60797dedee"} Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.497062 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.497162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"783fffea87344140407fc6db504afe50d283defe3c73a9a986ef548e1338d514"} Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.525422 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5l5hs" podStartSLOduration=6.8687956660000005 podStartE2EDuration="14.525389377s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" firstStartedPulling="2026-03-20 13:45:37.960267008 +0000 UTC m=+917.558199567" lastFinishedPulling="2026-03-20 13:45:45.616860749 +0000 UTC m=+925.214793278" observedRunningTime="2026-03-20 13:45:50.521066179 +0000 UTC m=+930.118998748" watchObservedRunningTime="2026-03-20 13:45:50.525389377 +0000 UTC m=+930.123321946" Mar 20 13:45:52 crc kubenswrapper[4755]: I0320 13:45:52.766195 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:52 crc kubenswrapper[4755]: I0320 13:45:52.806356 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:57 crc kubenswrapper[4755]: I0320 13:45:57.182999 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:57 crc kubenswrapper[4755]: I0320 13:45:57.312762 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:58 crc kubenswrapper[4755]: I0320 13:45:58.792082 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6vf4n" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.143322 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"] Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.145349 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.149239 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.153542 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"] Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.154373 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.155029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.217814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"auto-csr-approver-29566906-gwzkg\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") " pod="openshift-infra/auto-csr-approver-29566906-gwzkg" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.319122 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"auto-csr-approver-29566906-gwzkg\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") " pod="openshift-infra/auto-csr-approver-29566906-gwzkg" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.345020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"auto-csr-approver-29566906-gwzkg\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") " 
pod="openshift-infra/auto-csr-approver-29566906-gwzkg" Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.477363 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.021389 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"] Mar 20 13:46:01 crc kubenswrapper[4755]: W0320 13:46:01.032902 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47f9ab28_1218_4dcf_a989_728b9063a3e9.slice/crio-2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600 WatchSource:0}: Error finding container 2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600: Status 404 returned error can't find the container with id 2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600 Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.594519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" event={"ID":"47f9ab28-1218-4dcf-a989-728b9063a3e9","Type":"ContainerStarted","Data":"2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600"} Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.685841 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"] Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.687770 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5" Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.694423 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.694756 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cq82b" Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.696330 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.700453 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"] Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.750775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"openstack-operator-index-zspb5\" (UID: \"56977506-369a-4f65-be85-3ff2319ac213\") " pod="openstack-operators/openstack-operator-index-zspb5" Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.854826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"openstack-operator-index-zspb5\" (UID: \"56977506-369a-4f65-be85-3ff2319ac213\") " pod="openstack-operators/openstack-operator-index-zspb5" Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.874800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"openstack-operator-index-zspb5\" (UID: 
\"56977506-369a-4f65-be85-3ff2319ac213\") " pod="openstack-operators/openstack-operator-index-zspb5" Mar 20 13:46:02 crc kubenswrapper[4755]: I0320 13:46:02.021061 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5" Mar 20 13:46:02 crc kubenswrapper[4755]: I0320 13:46:02.507078 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"] Mar 20 13:46:02 crc kubenswrapper[4755]: W0320 13:46:02.530311 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56977506_369a_4f65_be85_3ff2319ac213.slice/crio-cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a WatchSource:0}: Error finding container cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a: Status 404 returned error can't find the container with id cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a Mar 20 13:46:02 crc kubenswrapper[4755]: I0320 13:46:02.605424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerStarted","Data":"cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a"} Mar 20 13:46:03 crc kubenswrapper[4755]: I0320 13:46:03.614833 4755 generic.go:334] "Generic (PLEG): container finished" podID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerID="cdf7ecd10574feecd4a5c8a0aa5c3c10ad9149f00e749b3e48d52a3b5587a97f" exitCode=0 Mar 20 13:46:03 crc kubenswrapper[4755]: I0320 13:46:03.614948 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" event={"ID":"47f9ab28-1218-4dcf-a989-728b9063a3e9","Type":"ContainerDied","Data":"cdf7ecd10574feecd4a5c8a0aa5c3c10ad9149f00e749b3e48d52a3b5587a97f"} Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.118294 4755 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.316072 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"47f9ab28-1218-4dcf-a989-728b9063a3e9\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") " Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.323002 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f" (OuterVolumeSpecName: "kube-api-access-nvh4f") pod "47f9ab28-1218-4dcf-a989-728b9063a3e9" (UID: "47f9ab28-1218-4dcf-a989-728b9063a3e9"). InnerVolumeSpecName "kube-api-access-nvh4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.417606 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.634193 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" event={"ID":"47f9ab28-1218-4dcf-a989-728b9063a3e9","Type":"ContainerDied","Data":"2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600"} Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.634300 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600" Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.634311 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.641334 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"] Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.196925 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.204032 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.260879 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-98hpf"] Mar 20 13:46:06 crc kubenswrapper[4755]: E0320 13:46:06.261243 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerName="oc" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.261261 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerName="oc" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.261405 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerName="oc" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.261949 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.270967 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-98hpf"] Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.332263 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9q8h\" (UniqueName: \"kubernetes.io/projected/b58ff15e-f098-460d-ada4-3bdd990125ba-kube-api-access-m9q8h\") pod \"openstack-operator-index-98hpf\" (UID: \"b58ff15e-f098-460d-ada4-3bdd990125ba\") " pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.434846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9q8h\" (UniqueName: \"kubernetes.io/projected/b58ff15e-f098-460d-ada4-3bdd990125ba-kube-api-access-m9q8h\") pod \"openstack-operator-index-98hpf\" (UID: \"b58ff15e-f098-460d-ada4-3bdd990125ba\") " pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.469695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9q8h\" (UniqueName: \"kubernetes.io/projected/b58ff15e-f098-460d-ada4-3bdd990125ba-kube-api-access-m9q8h\") pod \"openstack-operator-index-98hpf\" (UID: \"b58ff15e-f098-460d-ada4-3bdd990125ba\") " pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.583760 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.751614 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.752229 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.752292 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.753329 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.753414 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38" gracePeriod=600 Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.021535 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-index-98hpf"] Mar 20 13:46:07 crc kubenswrapper[4755]: W0320 13:46:07.022689 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb58ff15e_f098_460d_ada4_3bdd990125ba.slice/crio-e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb WatchSource:0}: Error finding container e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb: Status 404 returned error can't find the container with id e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.237165 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" path="/var/lib/kubelet/pods/6af9d427-765f-4d25-9603-e0b39103e2cc/volumes" Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.653739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98hpf" event={"ID":"b58ff15e-f098-460d-ada4-3bdd990125ba","Type":"ContainerStarted","Data":"0dac578f595955d60f5e85a5f64f072755d5f2116d535d213c2c9c141d43d11a"} Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.654251 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98hpf" event={"ID":"b58ff15e-f098-460d-ada4-3bdd990125ba","Type":"ContainerStarted","Data":"e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb"} Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.656379 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerStarted","Data":"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"} Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.656571 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-zspb5" podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server" containerID="cri-o://50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f" gracePeriod=2 Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660702 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38" exitCode=0 Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38"} Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660777 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773"} Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660801 4755 scope.go:117] "RemoveContainer" containerID="993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08" Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.680055 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-98hpf" podStartSLOduration=1.62983492 podStartE2EDuration="1.680026105s" podCreationTimestamp="2026-03-20 13:46:06 +0000 UTC" firstStartedPulling="2026-03-20 13:46:07.026921608 +0000 UTC m=+946.624854137" lastFinishedPulling="2026-03-20 13:46:07.077112783 +0000 UTC m=+946.675045322" observedRunningTime="2026-03-20 13:46:07.678623318 +0000 UTC m=+947.276555857" watchObservedRunningTime="2026-03-20 13:46:07.680026105 +0000 UTC m=+947.277958664" Mar 20 13:46:07 crc 
kubenswrapper[4755]: I0320 13:46:07.718091 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zspb5" podStartSLOduration=2.6368764000000002 podStartE2EDuration="6.7180589s" podCreationTimestamp="2026-03-20 13:46:01 +0000 UTC" firstStartedPulling="2026-03-20 13:46:02.536494534 +0000 UTC m=+942.134427073" lastFinishedPulling="2026-03-20 13:46:06.617677044 +0000 UTC m=+946.215609573" observedRunningTime="2026-03-20 13:46:07.714865564 +0000 UTC m=+947.312798123" watchObservedRunningTime="2026-03-20 13:46:07.7180589 +0000 UTC m=+947.315991459" Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.770260 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.166229 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.364680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"56977506-369a-4f65-be85-3ff2319ac213\" (UID: \"56977506-369a-4f65-be85-3ff2319ac213\") " Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.385177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8" (OuterVolumeSpecName: "kube-api-access-l6xp8") pod "56977506-369a-4f65-be85-3ff2319ac213" (UID: "56977506-369a-4f65-be85-3ff2319ac213"). InnerVolumeSpecName "kube-api-access-l6xp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.467557 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.677560 4755 generic.go:334] "Generic (PLEG): container finished" podID="56977506-369a-4f65-be85-3ff2319ac213" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f" exitCode=0 Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.677742 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.677739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerDied","Data":"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"} Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.678481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerDied","Data":"cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a"} Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.678519 4755 scope.go:117] "RemoveContainer" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.723052 4755 scope.go:117] "RemoveContainer" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f" Mar 20 13:46:08 crc kubenswrapper[4755]: E0320 13:46:08.725525 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f\": container with ID starting with 50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f not found: ID does not exist" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.725586 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"} err="failed to get container status \"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f\": rpc error: code = NotFound desc = could not find container \"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f\": container with ID starting with 50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f not found: ID does not exist" Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.734809 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"] Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.743733 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"] Mar 20 13:46:09 crc kubenswrapper[4755]: I0320 13:46:09.239427 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56977506-369a-4f65-be85-3ff2319ac213" path="/var/lib/kubelet/pods/56977506-369a-4f65-be85-3ff2319ac213/volumes" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.664931 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"] Mar 20 13:46:13 crc kubenswrapper[4755]: E0320 13:46:13.666832 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.666902 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.667342 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.670003 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.682376 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"] Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.755718 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.755836 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.755880 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.857727 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.858084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.858257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.858965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.859013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.883750 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.038193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.496788 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"]
Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.730682 4755 generic.go:334] "Generic (PLEG): container finished" podID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f" exitCode=0
Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.730872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f"}
Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.733668 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerStarted","Data":"4a8482d9f40332c626354ce360719a45a8861fce5e40d0043025c4c874f4eed4"}
Mar 20 13:46:15 crc kubenswrapper[4755]: I0320 13:46:15.744803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerStarted","Data":"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"}
Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.584643 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.585020 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.636232 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.756181 4755 generic.go:334] "Generic (PLEG): container finished" podID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad" exitCode=0
Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.756274 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"}
Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.809419 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.253814 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"]
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.256357 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.259514 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"]
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.312202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.312635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.312723 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.413548 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.413604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.413686 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.414191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.414568 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.441203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.626536 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.780436 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerStarted","Data":"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"}
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.809197 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c2c4n" podStartSLOduration=2.329643205 podStartE2EDuration="4.809179352s" podCreationTimestamp="2026-03-20 13:46:13 +0000 UTC" firstStartedPulling="2026-03-20 13:46:14.732983443 +0000 UTC m=+954.330915972" lastFinishedPulling="2026-03-20 13:46:17.21251958 +0000 UTC m=+956.810452119" observedRunningTime="2026-03-20 13:46:17.804090433 +0000 UTC m=+957.402022962" watchObservedRunningTime="2026-03-20 13:46:17.809179352 +0000 UTC m=+957.407111881"
Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.898120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"]
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.289544 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"]
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.291072 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.293803 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mrwws"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.303723 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"]
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.356187 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.356677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.356720 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.457992 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458678 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.484283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.669201 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.789280 4755 generic.go:334] "Generic (PLEG): container finished" podID="47277032-9d6e-4e0e-81a1-42a899786245" containerID="77b04896a1353bc4a2e73a9e9c84e70a96936d649c2d4e962bf525f42511fa6e" exitCode=0
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.789579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"77b04896a1353bc4a2e73a9e9c84e70a96936d649c2d4e962bf525f42511fa6e"}
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.789780 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerStarted","Data":"2576fac110f0f8c9949cb9355f303a460a65145a8245ced3044dd2d52050d5d9"}
Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.914553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"]
Mar 20 13:46:18 crc kubenswrapper[4755]: W0320 13:46:18.921169 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e2672d_2bea_46ce_961b_58decbe4a9c4.slice/crio-48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b WatchSource:0}: Error finding container 48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b: Status 404 returned error can't find the container with id 48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b
Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.798536 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerID="036824045df862fed19c828c5797abf74f91ee509bc715ee570a02cfd88be198" exitCode=0
Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.798607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"036824045df862fed19c828c5797abf74f91ee509bc715ee570a02cfd88be198"}
Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.800286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerStarted","Data":"48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b"}
Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.803315 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerStarted","Data":"758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e"}
Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 13:46:20.811756 4755 generic.go:334] "Generic (PLEG): container finished" podID="47277032-9d6e-4e0e-81a1-42a899786245" containerID="758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e" exitCode=0
Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 13:46:20.811864 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e"}
Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 13:46:20.814751 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerID="17772537607eeddbda658477b19b56ca7a81d21afc8bf3352bc4081f64d0344e" exitCode=0
Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 13:46:20.814804 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"17772537607eeddbda658477b19b56ca7a81d21afc8bf3352bc4081f64d0344e"}
Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.828235 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerID="9b8c9133d316b84d95857a5bbf000e1aec6ee4c309b68cb9c7bfd99bf8be3084" exitCode=0
Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.828413 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"9b8c9133d316b84d95857a5bbf000e1aec6ee4c309b68cb9c7bfd99bf8be3084"}
Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.834109 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerStarted","Data":"fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4"}
Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.866075 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gbxwj" podStartSLOduration=2.228820443 podStartE2EDuration="4.86605268s" podCreationTimestamp="2026-03-20 13:46:17 +0000 UTC" firstStartedPulling="2026-03-20 13:46:18.792209886 +0000 UTC m=+958.390142415" lastFinishedPulling="2026-03-20 13:46:21.429442083 +0000 UTC m=+961.027374652" observedRunningTime="2026-03-20 13:46:21.863771449 +0000 UTC m=+961.461703988" watchObservedRunningTime="2026-03-20 13:46:21.86605268 +0000 UTC m=+961.463985209"
Mar 20 13:46:22 crc kubenswrapper[4755]: I0320 13:46:22.455286 4755 scope.go:117] "RemoveContainer" containerID="5e2db37c71b317712977cebcaa50946c5ceb89b2e6b5818b0d77ab95b610e0b6"
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.186970 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.329548 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"74e2672d-2bea-46ce-961b-58decbe4a9c4\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") "
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.329751 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"74e2672d-2bea-46ce-961b-58decbe4a9c4\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") "
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.329779 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"74e2672d-2bea-46ce-961b-58decbe4a9c4\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") "
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.330327 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle" (OuterVolumeSpecName: "bundle") pod "74e2672d-2bea-46ce-961b-58decbe4a9c4" (UID: "74e2672d-2bea-46ce-961b-58decbe4a9c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.336464 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s" (OuterVolumeSpecName: "kube-api-access-6rd8s") pod "74e2672d-2bea-46ce-961b-58decbe4a9c4" (UID: "74e2672d-2bea-46ce-961b-58decbe4a9c4"). InnerVolumeSpecName "kube-api-access-6rd8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.363240 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util" (OuterVolumeSpecName: "util") pod "74e2672d-2bea-46ce-961b-58decbe4a9c4" (UID: "74e2672d-2bea-46ce-961b-58decbe4a9c4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.431246 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.431284 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.431293 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.851198 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b"}
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.851250 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b"
Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.851264 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"
Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.040092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.040486 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.095546 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.937458 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.444138 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"]
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.445099 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c2c4n" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server" containerID="cri-o://008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f" gracePeriod=2
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.627446 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.629404 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.707535 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.812759 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887115 4755 generic.go:334] "Generic (PLEG): container finished" podID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f" exitCode=0
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887249 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"}
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887335 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"4a8482d9f40332c626354ce360719a45a8861fce5e40d0043025c4c874f4eed4"}
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887362 4755 scope.go:117] "RemoveContainer" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.905630 4755 scope.go:117] "RemoveContainer" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.936343 4755 scope.go:117] "RemoveContainer" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.943120 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gbxwj"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.961717 4755 scope.go:117] "RemoveContainer" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"
Mar 20 13:46:27 crc kubenswrapper[4755]: E0320 13:46:27.962476 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f\": container with ID starting with 008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f not found: ID does not exist" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.962543 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"} err="failed to get container status \"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f\": rpc error: code = NotFound desc = could not find container \"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f\": container with ID starting with 008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f not found: ID does not exist"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.962587 4755 scope.go:117] "RemoveContainer" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"
Mar 20 13:46:27 crc kubenswrapper[4755]: E0320 13:46:27.963313 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad\": container with ID starting with a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad not found: ID does not exist" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.963360 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"} err="failed to get container status \"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad\": rpc error: code = NotFound desc = could not find container \"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad\": container with ID starting with a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad not found: ID does not exist"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.963387 4755 scope.go:117] "RemoveContainer" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f"
Mar 20 13:46:27 crc kubenswrapper[4755]: E0320 13:46:27.964235 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f\": container with ID starting with 7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f not found: ID does not exist" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.964277 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f"} err="failed to get container status \"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f\": rpc error: code = NotFound desc = could not find container \"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f\": container with ID starting with 7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f not found: ID does not exist"
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.999162 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"dec99880-d50d-4204-a8d5-0079e4175e5c\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") "
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.999285 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"dec99880-d50d-4204-a8d5-0079e4175e5c\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") "
Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.999497 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"dec99880-d50d-4204-a8d5-0079e4175e5c\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") "
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.003479 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities" (OuterVolumeSpecName: "utilities") pod "dec99880-d50d-4204-a8d5-0079e4175e5c" (UID: "dec99880-d50d-4204-a8d5-0079e4175e5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.007721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph" (OuterVolumeSpecName: "kube-api-access-xt9ph") pod "dec99880-d50d-4204-a8d5-0079e4175e5c" (UID: "dec99880-d50d-4204-a8d5-0079e4175e5c"). InnerVolumeSpecName "kube-api-access-xt9ph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.055781 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dec99880-d50d-4204-a8d5-0079e4175e5c" (UID: "dec99880-d50d-4204-a8d5-0079e4175e5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.100832 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.100867 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.100877 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.225147 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"]
Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.231753 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"]
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.246306 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" path="/var/lib/kubelet/pods/dec99880-d50d-4204-a8d5-0079e4175e5c/volumes"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.647133 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lxv97"]
Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.647752 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="pull"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.647846 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="pull"
Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.647918 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="util"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.647970 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="util"
Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648024 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="extract-utilities"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648082 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="extract-utilities"
Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648136 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648191 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server"
Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648248 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="extract-content"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648297 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="extract-content"
Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648350 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="extract"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648401 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="extract"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648561 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648624 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="extract"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.649578 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxv97"
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.665913 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxv97"]
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.721256 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt"]
Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.723102 4755 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.726690 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rvkg9" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.745038 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt"] Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827608 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9ttj\" (UniqueName: \"kubernetes.io/projected/e837c2d9-26ab-47a1-b48a-44f28fc2e2a6-kube-api-access-x9ttj\") pod \"openstack-operator-controller-init-6c645d7445-cbmxt\" (UID: \"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6\") " pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827739 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66k9h\" 
(UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.929729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.929830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9ttj\" (UniqueName: \"kubernetes.io/projected/e837c2d9-26ab-47a1-b48a-44f28fc2e2a6-kube-api-access-x9ttj\") pod \"openstack-operator-controller-init-6c645d7445-cbmxt\" (UID: \"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6\") " pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.929891 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.930691 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.931257 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.931330 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.950060 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.950113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9ttj\" (UniqueName: \"kubernetes.io/projected/e837c2d9-26ab-47a1-b48a-44f28fc2e2a6-kube-api-access-x9ttj\") pod \"openstack-operator-controller-init-6c645d7445-cbmxt\" (UID: \"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6\") " pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.965589 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.038474 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.283245 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxv97"] Mar 20 13:46:30 crc kubenswrapper[4755]: W0320 13:46:30.290460 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd61b3b_557c_4bfc_9ca3_784180e34cf4.slice/crio-f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c WatchSource:0}: Error finding container f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c: Status 404 returned error can't find the container with id f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.597484 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt"] Mar 20 13:46:30 crc kubenswrapper[4755]: W0320 13:46:30.609247 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode837c2d9_26ab_47a1_b48a_44f28fc2e2a6.slice/crio-9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540 WatchSource:0}: Error finding container 9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540: Status 404 returned error can't find the container with id 9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540 Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.911827 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerID="7af298ba286ba398de5b17fcdc33ee1e8e40174a379e05601893aa3b49135359" exitCode=0 Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.911921 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" 
event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"7af298ba286ba398de5b17fcdc33ee1e8e40174a379e05601893aa3b49135359"} Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.911978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerStarted","Data":"f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c"} Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.913930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" event={"ID":"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6","Type":"ContainerStarted","Data":"9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540"} Mar 20 13:46:31 crc kubenswrapper[4755]: I0320 13:46:31.933027 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerID="cfeab6a9dcefec03928d4fe10a66cdd355e4a9240128bfce4f125c6b8e4019f7" exitCode=0 Mar 20 13:46:31 crc kubenswrapper[4755]: I0320 13:46:31.934250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"cfeab6a9dcefec03928d4fe10a66cdd355e4a9240128bfce4f125c6b8e4019f7"} Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 13:46:32.243196 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 13:46:32.243438 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gbxwj" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="registry-server" containerID="cri-o://fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4" gracePeriod=2 Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 
13:46:32.943285 4755 generic.go:334] "Generic (PLEG): container finished" podID="47277032-9d6e-4e0e-81a1-42a899786245" containerID="fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4" exitCode=0 Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 13:46:32.943350 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.351725 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.439227 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"47277032-9d6e-4e0e-81a1-42a899786245\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.439319 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"47277032-9d6e-4e0e-81a1-42a899786245\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.439385 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"47277032-9d6e-4e0e-81a1-42a899786245\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.440604 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities" (OuterVolumeSpecName: "utilities") pod "47277032-9d6e-4e0e-81a1-42a899786245" (UID: "47277032-9d6e-4e0e-81a1-42a899786245"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.447873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs" (OuterVolumeSpecName: "kube-api-access-6d4xs") pod "47277032-9d6e-4e0e-81a1-42a899786245" (UID: "47277032-9d6e-4e0e-81a1-42a899786245"). InnerVolumeSpecName "kube-api-access-6d4xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.480377 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47277032-9d6e-4e0e-81a1-42a899786245" (UID: "47277032-9d6e-4e0e-81a1-42a899786245"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.541495 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.541526 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.541539 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.969018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerStarted","Data":"855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.972288 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"2576fac110f0f8c9949cb9355f303a460a65145a8245ced3044dd2d52050d5d9"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.972348 4755 scope.go:117] "RemoveContainer" containerID="fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.972496 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.976497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" event={"ID":"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6","Type":"ContainerStarted","Data":"c0d02e673f830c04ff018db3b765586e7afdf71e7fee50fab41ea873a737d1bf"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.976681 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.992132 4755 scope.go:117] "RemoveContainer" containerID="758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e" Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.012349 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lxv97" podStartSLOduration=2.391152284 podStartE2EDuration="7.012316955s" podCreationTimestamp="2026-03-20 13:46:29 +0000 UTC" firstStartedPulling="2026-03-20 13:46:30.913810428 +0000 UTC m=+970.511742947" lastFinishedPulling="2026-03-20 13:46:35.534975089 +0000 UTC m=+975.132907618" observedRunningTime="2026-03-20 13:46:35.996492444 +0000 UTC m=+975.594424973" watchObservedRunningTime="2026-03-20 13:46:36.012316955 +0000 UTC m=+975.610249484" Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.013938 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.019862 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.020978 4755 scope.go:117] "RemoveContainer" containerID="77b04896a1353bc4a2e73a9e9c84e70a96936d649c2d4e962bf525f42511fa6e" Mar 20 13:46:36 crc 
kubenswrapper[4755]: I0320 13:46:36.041317 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" podStartSLOduration=2.048812521 podStartE2EDuration="7.041294763s" podCreationTimestamp="2026-03-20 13:46:29 +0000 UTC" firstStartedPulling="2026-03-20 13:46:30.612622345 +0000 UTC m=+970.210554874" lastFinishedPulling="2026-03-20 13:46:35.605104577 +0000 UTC m=+975.203037116" observedRunningTime="2026-03-20 13:46:36.040386439 +0000 UTC m=+975.638319008" watchObservedRunningTime="2026-03-20 13:46:36.041294763 +0000 UTC m=+975.639227292" Mar 20 13:46:37 crc kubenswrapper[4755]: I0320 13:46:37.232982 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47277032-9d6e-4e0e-81a1-42a899786245" path="/var/lib/kubelet/pods/47277032-9d6e-4e0e-81a1-42a899786245/volumes" Mar 20 13:46:39 crc kubenswrapper[4755]: I0320 13:46:39.967177 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:39 crc kubenswrapper[4755]: I0320 13:46:39.967867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:40 crc kubenswrapper[4755]: I0320 13:46:40.044424 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:40 crc kubenswrapper[4755]: I0320 13:46:40.047006 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:40 crc kubenswrapper[4755]: I0320 13:46:40.141080 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:42 crc kubenswrapper[4755]: I0320 13:46:42.447399 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-lxv97"] Mar 20 13:46:42 crc kubenswrapper[4755]: I0320 13:46:42.448182 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lxv97" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="registry-server" containerID="cri-o://855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b" gracePeriod=2 Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.069092 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerID="855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b" exitCode=0 Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.069137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b"} Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.384709 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.564016 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.565710 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.565955 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.567191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities" (OuterVolumeSpecName: "utilities") pod "2cd61b3b-557c-4bfc-9ca3-784180e34cf4" (UID: "2cd61b3b-557c-4bfc-9ca3-784180e34cf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.574408 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h" (OuterVolumeSpecName: "kube-api-access-66k9h") pod "2cd61b3b-557c-4bfc-9ca3-784180e34cf4" (UID: "2cd61b3b-557c-4bfc-9ca3-784180e34cf4"). InnerVolumeSpecName "kube-api-access-66k9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.623547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cd61b3b-557c-4bfc-9ca3-784180e34cf4" (UID: "2cd61b3b-557c-4bfc-9ca3-784180e34cf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.668293 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.668354 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.668379 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.089551 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c"} Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.089773 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.090396 4755 scope.go:117] "RemoveContainer" containerID="855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b" Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.109475 4755 scope.go:117] "RemoveContainer" containerID="cfeab6a9dcefec03928d4fe10a66cdd355e4a9240128bfce4f125c6b8e4019f7" Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.124181 4755 scope.go:117] "RemoveContainer" containerID="7af298ba286ba398de5b17fcdc33ee1e8e40174a379e05601893aa3b49135359" Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.187009 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxv97"] Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.194380 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lxv97"] Mar 20 13:46:45 crc kubenswrapper[4755]: I0320 13:46:45.232593 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" path="/var/lib/kubelet/pods/2cd61b3b-557c-4bfc-9ca3-784180e34cf4/volumes" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.361018 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"] Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.361970 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-content" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.361990 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-content" Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362010 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47277032-9d6e-4e0e-81a1-42a899786245" 
containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362018 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362032 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362039 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362051 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-content" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362060 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-content" Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362071 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-utilities" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362078 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-utilities" Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362092 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-utilities" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362099 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-utilities" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362239 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" 
containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362261 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362805 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.364602 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wbx7b" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.380595 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.381427 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.384354 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-rk298" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.396167 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.406674 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.407613 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.420346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-r9t7k" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.422840 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.427339 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.428348 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.436841 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xb8qd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.441079 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.447354 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.448425 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.450271 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wqxv7" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.457572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.465665 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.486495 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.487630 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.489703 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.490391 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.490973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ttdmd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.514678 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-l7q5f" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.514880 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.516028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkwj7\" (UniqueName: \"kubernetes.io/projected/4c1ba89a-aed6-4245-8411-4d1fecac2500-kube-api-access-pkwj7\") pod \"cinder-operator-controller-manager-8d58dc466-pr9d5\" (UID: \"4c1ba89a-aed6-4245-8411-4d1fecac2500\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.516133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp8x9\" (UniqueName: \"kubernetes.io/projected/3a22a8d8-92cd-4177-a597-9c659673392c-kube-api-access-pp8x9\") pod \"barbican-operator-controller-manager-59bc569d95-ds4tb\" (UID: \"3a22a8d8-92cd-4177-a597-9c659673392c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.576150 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.589744 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.600335 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.601448 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.613073 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f9ccv" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.613260 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7ntp\" (UniqueName: \"kubernetes.io/projected/552e0390-e86e-4972-bf6f-a4570e6b6f81-kube-api-access-h7ntp\") pod \"heat-operator-controller-manager-67dd5f86f5-92fwj\" (UID: \"552e0390-e86e-4972-bf6f-a4570e6b6f81\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617116 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg98q\" (UniqueName: \"kubernetes.io/projected/00fc80a4-4ea8-4f61-8795-6473f0adc40a-kube-api-access-bg98q\") pod \"designate-operator-controller-manager-588d4d986b-xd8mk\" (UID: \"00fc80a4-4ea8-4f61-8795-6473f0adc40a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617150 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-qtf4x\" (UniqueName: \"kubernetes.io/projected/52210224-8989-4e16-8fdf-4ea3a8211b10-kube-api-access-qtf4x\") pod \"horizon-operator-controller-manager-8464cc45fb-hxmnd\" (UID: \"52210224-8989-4e16-8fdf-4ea3a8211b10\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617186 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp8x9\" (UniqueName: \"kubernetes.io/projected/3a22a8d8-92cd-4177-a597-9c659673392c-kube-api-access-pp8x9\") pod \"barbican-operator-controller-manager-59bc569d95-ds4tb\" (UID: \"3a22a8d8-92cd-4177-a597-9c659673392c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617212 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkwj7\" (UniqueName: \"kubernetes.io/projected/4c1ba89a-aed6-4245-8411-4d1fecac2500-kube-api-access-pkwj7\") pod \"cinder-operator-controller-manager-8d58dc466-pr9d5\" (UID: \"4c1ba89a-aed6-4245-8411-4d1fecac2500\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617230 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmtx\" (UniqueName: \"kubernetes.io/projected/83d6120d-b54b-452c-aa8a-026665f1afae-kube-api-access-qbmtx\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod 
\"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617288 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpswh\" (UniqueName: \"kubernetes.io/projected/bc80030a-428b-4643-9d8d-2b0e9c873060-kube-api-access-tpswh\") pod \"glance-operator-controller-manager-79df6bcc97-cbj27\" (UID: \"bc80030a-428b-4643-9d8d-2b0e9c873060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.646955 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.648362 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.654179 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-56pln" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.654550 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.654568 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp8x9\" (UniqueName: \"kubernetes.io/projected/3a22a8d8-92cd-4177-a597-9c659673392c-kube-api-access-pp8x9\") pod \"barbican-operator-controller-manager-59bc569d95-ds4tb\" (UID: \"3a22a8d8-92cd-4177-a597-9c659673392c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.658872 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkwj7\" (UniqueName: \"kubernetes.io/projected/4c1ba89a-aed6-4245-8411-4d1fecac2500-kube-api-access-pkwj7\") pod \"cinder-operator-controller-manager-8d58dc466-pr9d5\" (UID: \"4c1ba89a-aed6-4245-8411-4d1fecac2500\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.667547 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.668473 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.676076 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nkxw2" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.684013 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.697381 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.698369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.699519 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.701151 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hk896" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723539 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmtx\" (UniqueName: \"kubernetes.io/projected/83d6120d-b54b-452c-aa8a-026665f1afae-kube-api-access-qbmtx\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723612 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723640 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpswh\" (UniqueName: \"kubernetes.io/projected/bc80030a-428b-4643-9d8d-2b0e9c873060-kube-api-access-tpswh\") pod \"glance-operator-controller-manager-79df6bcc97-cbj27\" (UID: \"bc80030a-428b-4643-9d8d-2b0e9c873060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7ntp\" (UniqueName: \"kubernetes.io/projected/552e0390-e86e-4972-bf6f-a4570e6b6f81-kube-api-access-h7ntp\") pod \"heat-operator-controller-manager-67dd5f86f5-92fwj\" (UID: 
\"552e0390-e86e-4972-bf6f-a4570e6b6f81\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg98q\" (UniqueName: \"kubernetes.io/projected/00fc80a4-4ea8-4f61-8795-6473f0adc40a-kube-api-access-bg98q\") pod \"designate-operator-controller-manager-588d4d986b-xd8mk\" (UID: \"00fc80a4-4ea8-4f61-8795-6473f0adc40a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtf4x\" (UniqueName: \"kubernetes.io/projected/52210224-8989-4e16-8fdf-4ea3a8211b10-kube-api-access-qtf4x\") pod \"horizon-operator-controller-manager-8464cc45fb-hxmnd\" (UID: \"52210224-8989-4e16-8fdf-4ea3a8211b10\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg89d\" (UniqueName: \"kubernetes.io/projected/21c9358d-2c84-4c38-9c91-8ca3dad4dab7-kube-api-access-qg89d\") pod \"ironic-operator-controller-manager-6f787dddc9-f2nbs\" (UID: \"21c9358d-2c84-4c38-9c91-8ca3dad4dab7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.724153 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.724199 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. 
No retries permitted until 2026-03-20 13:46:59.224178837 +0000 UTC m=+998.822111366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.711568 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.730703 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.773603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7ntp\" (UniqueName: \"kubernetes.io/projected/552e0390-e86e-4972-bf6f-a4570e6b6f81-kube-api-access-h7ntp\") pod \"heat-operator-controller-manager-67dd5f86f5-92fwj\" (UID: \"552e0390-e86e-4972-bf6f-a4570e6b6f81\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.777828 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-c8crg"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.779698 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.783131 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lvnsc" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.786160 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpswh\" (UniqueName: \"kubernetes.io/projected/bc80030a-428b-4643-9d8d-2b0e9c873060-kube-api-access-tpswh\") pod \"glance-operator-controller-manager-79df6bcc97-cbj27\" (UID: \"bc80030a-428b-4643-9d8d-2b0e9c873060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.786852 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmtx\" (UniqueName: \"kubernetes.io/projected/83d6120d-b54b-452c-aa8a-026665f1afae-kube-api-access-qbmtx\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.786963 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtf4x\" (UniqueName: \"kubernetes.io/projected/52210224-8989-4e16-8fdf-4ea3a8211b10-kube-api-access-qtf4x\") pod \"horizon-operator-controller-manager-8464cc45fb-hxmnd\" (UID: \"52210224-8989-4e16-8fdf-4ea3a8211b10\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.793753 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg98q\" (UniqueName: \"kubernetes.io/projected/00fc80a4-4ea8-4f61-8795-6473f0adc40a-kube-api-access-bg98q\") pod \"designate-operator-controller-manager-588d4d986b-xd8mk\" (UID: 
\"00fc80a4-4ea8-4f61-8795-6473f0adc40a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.800742 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.801765 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.805122 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-c8crg"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.805173 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nwt2m" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.820811 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.822310 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.824524 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cgp2k" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825275 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdzd7\" (UniqueName: \"kubernetes.io/projected/5a83ca27-3334-4aac-9129-5635d3af0714-kube-api-access-kdzd7\") pod \"keystone-operator-controller-manager-768b96df4c-4x5nd\" (UID: \"5a83ca27-3334-4aac-9129-5635d3af0714\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825340 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjn75\" (UniqueName: \"kubernetes.io/projected/7f51051e-6a90-4582-a411-28a106c37118-kube-api-access-cjn75\") pod \"manila-operator-controller-manager-55f864c847-gszjd\" (UID: \"7f51051e-6a90-4582-a411-28a106c37118\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825436 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcwh\" (UniqueName: \"kubernetes.io/projected/fe4ddc70-f382-4b32-8879-122023b45438-kube-api-access-qbcwh\") pod \"mariadb-operator-controller-manager-67ccfc9778-2d9hb\" (UID: \"fe4ddc70-f382-4b32-8879-122023b45438\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825605 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg89d\" (UniqueName: 
\"kubernetes.io/projected/21c9358d-2c84-4c38-9c91-8ca3dad4dab7-kube-api-access-qg89d\") pod \"ironic-operator-controller-manager-6f787dddc9-f2nbs\" (UID: \"21c9358d-2c84-4c38-9c91-8ca3dad4dab7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.833391 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.841980 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.861382 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.874616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg89d\" (UniqueName: \"kubernetes.io/projected/21c9358d-2c84-4c38-9c91-8ca3dad4dab7-kube-api-access-qg89d\") pod \"ironic-operator-controller-manager-6f787dddc9-f2nbs\" (UID: \"21c9358d-2c84-4c38-9c91-8ca3dad4dab7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.887859 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.892975 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.897949 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.899041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.900569 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qk8zk" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.902997 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.903347 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-n4kd2" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.918798 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927082 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcwh\" (UniqueName: \"kubernetes.io/projected/fe4ddc70-f382-4b32-8879-122023b45438-kube-api-access-qbcwh\") pod \"mariadb-operator-controller-manager-67ccfc9778-2d9hb\" (UID: \"fe4ddc70-f382-4b32-8879-122023b45438\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdzd7\" (UniqueName: 
\"kubernetes.io/projected/5a83ca27-3334-4aac-9129-5635d3af0714-kube-api-access-kdzd7\") pod \"keystone-operator-controller-manager-768b96df4c-4x5nd\" (UID: \"5a83ca27-3334-4aac-9129-5635d3af0714\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927213 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjn75\" (UniqueName: \"kubernetes.io/projected/7f51051e-6a90-4582-a411-28a106c37118-kube-api-access-cjn75\") pod \"manila-operator-controller-manager-55f864c847-gszjd\" (UID: \"7f51051e-6a90-4582-a411-28a106c37118\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz24w\" (UniqueName: \"kubernetes.io/projected/358d4809-db3b-4468-8c8c-4ffbedc0ec89-kube-api-access-wz24w\") pod \"nova-operator-controller-manager-5d488d59fb-r8jn7\" (UID: \"358d4809-db3b-4468-8c8c-4ffbedc0ec89\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjwwg\" (UniqueName: \"kubernetes.io/projected/b3c037b9-79d2-45ea-9b92-66e50eb20e6b-kube-api-access-tjwwg\") pod \"octavia-operator-controller-manager-5b9f45d989-c5nbs\" (UID: \"b3c037b9-79d2-45ea-9b92-66e50eb20e6b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927357 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8qk\" (UniqueName: \"kubernetes.io/projected/1aaef0d5-16fe-4c61-82d5-660f29168171-kube-api-access-mb8qk\") pod 
\"neutron-operator-controller-manager-767865f676-c8crg\" (UID: \"1aaef0d5-16fe-4c61-82d5-660f29168171\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.928980 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.943602 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.944577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.956087 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.957601 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-69t7b" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.967057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcwh\" (UniqueName: \"kubernetes.io/projected/fe4ddc70-f382-4b32-8879-122023b45438-kube-api-access-qbcwh\") pod \"mariadb-operator-controller-manager-67ccfc9778-2d9hb\" (UID: \"fe4ddc70-f382-4b32-8879-122023b45438\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.975975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjn75\" (UniqueName: \"kubernetes.io/projected/7f51051e-6a90-4582-a411-28a106c37118-kube-api-access-cjn75\") pod \"manila-operator-controller-manager-55f864c847-gszjd\" 
(UID: \"7f51051e-6a90-4582-a411-28a106c37118\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.978365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdzd7\" (UniqueName: \"kubernetes.io/projected/5a83ca27-3334-4aac-9129-5635d3af0714-kube-api-access-kdzd7\") pod \"keystone-operator-controller-manager-768b96df4c-4x5nd\" (UID: \"5a83ca27-3334-4aac-9129-5635d3af0714\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.990681 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.992664 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.999424 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-v52dq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.012235 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.012948 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.023999 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8b2z\" (UniqueName: \"kubernetes.io/projected/72ea5d65-7221-4b25-9025-7a5c31bae331-kube-api-access-b8b2z\") pod \"ovn-operator-controller-manager-884679f54-2xgt8\" (UID: \"72ea5d65-7221-4b25-9025-7a5c31bae331\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz24w\" (UniqueName: \"kubernetes.io/projected/358d4809-db3b-4468-8c8c-4ffbedc0ec89-kube-api-access-wz24w\") pod \"nova-operator-controller-manager-5d488d59fb-r8jn7\" (UID: \"358d4809-db3b-4468-8c8c-4ffbedc0ec89\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028634 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjwwg\" (UniqueName: \"kubernetes.io/projected/b3c037b9-79d2-45ea-9b92-66e50eb20e6b-kube-api-access-tjwwg\") pod \"octavia-operator-controller-manager-5b9f45d989-c5nbs\" (UID: \"b3c037b9-79d2-45ea-9b92-66e50eb20e6b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8qk\" (UniqueName: \"kubernetes.io/projected/1aaef0d5-16fe-4c61-82d5-660f29168171-kube-api-access-mb8qk\") pod \"neutron-operator-controller-manager-767865f676-c8crg\" (UID: \"1aaef0d5-16fe-4c61-82d5-660f29168171\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 
13:46:59.028732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncn8t\" (UniqueName: \"kubernetes.io/projected/bad91c65-94da-4f8a-addb-21b037197217-kube-api-access-ncn8t\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.036460 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.046488 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.049137 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8qk\" (UniqueName: \"kubernetes.io/projected/1aaef0d5-16fe-4c61-82d5-660f29168171-kube-api-access-mb8qk\") pod \"neutron-operator-controller-manager-767865f676-c8crg\" (UID: \"1aaef0d5-16fe-4c61-82d5-660f29168171\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.050219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjwwg\" (UniqueName: \"kubernetes.io/projected/b3c037b9-79d2-45ea-9b92-66e50eb20e6b-kube-api-access-tjwwg\") pod \"octavia-operator-controller-manager-5b9f45d989-c5nbs\" (UID: \"b3c037b9-79d2-45ea-9b92-66e50eb20e6b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.064855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz24w\" (UniqueName: \"kubernetes.io/projected/358d4809-db3b-4468-8c8c-4ffbedc0ec89-kube-api-access-wz24w\") pod \"nova-operator-controller-manager-5d488d59fb-r8jn7\" (UID: \"358d4809-db3b-4468-8c8c-4ffbedc0ec89\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.072341 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.079734 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.109033 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.116091 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-gdw6q" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.125716 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134027 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgm46\" (UniqueName: \"kubernetes.io/projected/14b4b9ba-026c-4fd7-a57d-545e62b6981e-kube-api-access-hgm46\") pod \"swift-operator-controller-manager-c674c5965-bfh6x\" (UID: \"14b4b9ba-026c-4fd7-a57d-545e62b6981e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134187 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncn8t\" (UniqueName: \"kubernetes.io/projected/bad91c65-94da-4f8a-addb-21b037197217-kube-api-access-ncn8t\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134228 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fz62\" (UniqueName: \"kubernetes.io/projected/a1b32bae-fa65-45aa-a8db-b46a7351ee2c-kube-api-access-4fz62\") pod \"placement-operator-controller-manager-5784578c99-j7qf2\" (UID: \"a1b32bae-fa65-45aa-a8db-b46a7351ee2c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2z\" (UniqueName: \"kubernetes.io/projected/72ea5d65-7221-4b25-9025-7a5c31bae331-kube-api-access-b8b2z\") pod \"ovn-operator-controller-manager-884679f54-2xgt8\" (UID: \"72ea5d65-7221-4b25-9025-7a5c31bae331\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.136041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.137910 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.137972 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:46:59.637951215 +0000 UTC m=+999.235883744 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.145273 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.162728 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.172303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncn8t\" (UniqueName: \"kubernetes.io/projected/bad91c65-94da-4f8a-addb-21b037197217-kube-api-access-ncn8t\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.174415 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8b2z\" (UniqueName: \"kubernetes.io/projected/72ea5d65-7221-4b25-9025-7a5c31bae331-kube-api-access-b8b2z\") pod \"ovn-operator-controller-manager-884679f54-2xgt8\" (UID: \"72ea5d65-7221-4b25-9025-7a5c31bae331\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.219428 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.235851 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fz62\" (UniqueName: \"kubernetes.io/projected/a1b32bae-fa65-45aa-a8db-b46a7351ee2c-kube-api-access-4fz62\") pod \"placement-operator-controller-manager-5784578c99-j7qf2\" (UID: \"a1b32bae-fa65-45aa-a8db-b46a7351ee2c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.235910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.235985 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgm46\" (UniqueName: \"kubernetes.io/projected/14b4b9ba-026c-4fd7-a57d-545e62b6981e-kube-api-access-hgm46\") pod \"swift-operator-controller-manager-c674c5965-bfh6x\" (UID: \"14b4b9ba-026c-4fd7-a57d-545e62b6981e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.236012 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5x78\" (UniqueName: \"kubernetes.io/projected/bc93761d-ecc1-4179-8287-40fd76ba5ad1-kube-api-access-h5x78\") pod \"telemetry-operator-controller-manager-d6b694c5-hpmzq\" (UID: \"bc93761d-ecc1-4179-8287-40fd76ba5ad1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.236455 4755 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.236502 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.236484795 +0000 UTC m=+999.834417324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.257354 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fz62\" (UniqueName: \"kubernetes.io/projected/a1b32bae-fa65-45aa-a8db-b46a7351ee2c-kube-api-access-4fz62\") pod \"placement-operator-controller-manager-5784578c99-j7qf2\" (UID: \"a1b32bae-fa65-45aa-a8db-b46a7351ee2c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.258406 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.267748 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.268581 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.268695 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.272842 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgm46\" (UniqueName: \"kubernetes.io/projected/14b4b9ba-026c-4fd7-a57d-545e62b6981e-kube-api-access-hgm46\") pod \"swift-operator-controller-manager-c674c5965-bfh6x\" (UID: \"14b4b9ba-026c-4fd7-a57d-545e62b6981e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.272851 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-69lnm" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.295401 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.332204 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.337532 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5x78\" (UniqueName: \"kubernetes.io/projected/bc93761d-ecc1-4179-8287-40fd76ba5ad1-kube-api-access-h5x78\") pod \"telemetry-operator-controller-manager-d6b694c5-hpmzq\" (UID: \"bc93761d-ecc1-4179-8287-40fd76ba5ad1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.349128 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.357970 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.360301 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.368257 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5x78\" (UniqueName: \"kubernetes.io/projected/bc93761d-ecc1-4179-8287-40fd76ba5ad1-kube-api-access-h5x78\") pod \"telemetry-operator-controller-manager-d6b694c5-hpmzq\" (UID: \"bc93761d-ecc1-4179-8287-40fd76ba5ad1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.369194 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lbhd4" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.373287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.404123 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.405164 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.405612 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.408305 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.408448 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.409137 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7dlnc" Mar 20 13:46:59 crc kubenswrapper[4755]: W0320 13:46:59.425447 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a22a8d8_92cd_4177_a597_9c659673392c.slice/crio-5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4 WatchSource:0}: Error finding container 5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4: Status 404 returned error can't find the container with id 5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4 Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.428029 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.440030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.440887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz6tz\" (UniqueName: 
\"kubernetes.io/projected/26dfba7a-f5fa-45bc-a187-91ddce4da2d6-kube-api-access-wz6tz\") pod \"test-operator-controller-manager-5c5cb9c4d7-4khh5\" (UID: \"26dfba7a-f5fa-45bc-a187-91ddce4da2d6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.440921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9w7\" (UniqueName: \"kubernetes.io/projected/88862bd4-c890-447c-b4ee-b9cb1a4928e8-kube-api-access-zh9w7\") pod \"watcher-operator-controller-manager-6c4d75f7f9-z9px7\" (UID: \"88862bd4-c890-447c-b4ee-b9cb1a4928e8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.451414 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.472306 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.516833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542792 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz6tz\" (UniqueName: \"kubernetes.io/projected/26dfba7a-f5fa-45bc-a187-91ddce4da2d6-kube-api-access-wz6tz\") pod \"test-operator-controller-manager-5c5cb9c4d7-4khh5\" (UID: \"26dfba7a-f5fa-45bc-a187-91ddce4da2d6\") " 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542876 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9w7\" (UniqueName: \"kubernetes.io/projected/88862bd4-c890-447c-b4ee-b9cb1a4928e8-kube-api-access-zh9w7\") pod \"watcher-operator-controller-manager-6c4d75f7f9-z9px7\" (UID: \"88862bd4-c890-447c-b4ee-b9cb1a4928e8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvl2\" (UniqueName: \"kubernetes.io/projected/42c9c167-c386-4d60-868c-8b0b63fccbcd-kube-api-access-hrvl2\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.544171 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 
13:46:59.595782 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz6tz\" (UniqueName: \"kubernetes.io/projected/26dfba7a-f5fa-45bc-a187-91ddce4da2d6-kube-api-access-wz6tz\") pod \"test-operator-controller-manager-5c5cb9c4d7-4khh5\" (UID: \"26dfba7a-f5fa-45bc-a187-91ddce4da2d6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.602919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9w7\" (UniqueName: \"kubernetes.io/projected/88862bd4-c890-447c-b4ee-b9cb1a4928e8-kube-api-access-zh9w7\") pod \"watcher-operator-controller-manager-6c4d75f7f9-z9px7\" (UID: \"88862bd4-c890-447c-b4ee-b9cb1a4928e8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.613451 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.645852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvl2\" (UniqueName: \"kubernetes.io/projected/42c9c167-c386-4d60-868c-8b0b63fccbcd-kube-api-access-hrvl2\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.645904 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc 
kubenswrapper[4755]: I0320 13:46:59.645953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.646003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646147 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646216 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.146186962 +0000 UTC m=+999.744119481 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646577 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646609 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.146598303 +0000 UTC m=+999.744530832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646678 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646701 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.646695186 +0000 UTC m=+1000.244627715 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.733599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrvl2\" (UniqueName: \"kubernetes.io/projected/42c9c167-c386-4d60-868c-8b0b63fccbcd-kube-api-access-hrvl2\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.795071 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.848003 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.972729 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"] Mar 20 13:46:59 crc kubenswrapper[4755]: W0320 13:46:59.983149 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552e0390_e86e_4972_bf6f_a4570e6b6f81.slice/crio-ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff WatchSource:0}: Error finding container ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff: Status 404 returned error can't find the container with id ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.085635 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.102126 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.111213 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc80030a_428b_4643_9d8d_2b0e9c873060.slice/crio-a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f WatchSource:0}: Error finding container a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f: Status 404 returned error can't find the container with id a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.158498 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.158610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.158757 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 
13:47:00.158817 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:01.158797278 +0000 UTC m=+1000.756729807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.159244 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.159331 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:01.159308191 +0000 UTC m=+1000.757240710 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.198778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.208434 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f51051e_6a90_4582_a411_28a106c37118.slice/crio-b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750 WatchSource:0}: Error finding container b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750: Status 404 returned error can't find the container with id b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750 Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.231614 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.241565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" event={"ID":"5a83ca27-3334-4aac-9129-5635d3af0714","Type":"ContainerStarted","Data":"9912ebc9c17c1fc59076b1178f5fa6620b1b1314c2609f6fa2c634f98c212c6a"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.243116 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" event={"ID":"552e0390-e86e-4972-bf6f-a4570e6b6f81","Type":"ContainerStarted","Data":"ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff"} Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.243608 4755 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c037b9_79d2_45ea_9b92_66e50eb20e6b.slice/crio-a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d WatchSource:0}: Error finding container a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d: Status 404 returned error can't find the container with id a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.261347 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.261686 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.261749 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:02.261722227 +0000 UTC m=+1001.859654756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.263114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" event={"ID":"00fc80a4-4ea8-4f61-8795-6473f0adc40a","Type":"ContainerStarted","Data":"c82bf96563ace2a79cfd0edc367ecb82d6c8b7bd05f818afcbed05237bcff328"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.266849 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" event={"ID":"7f51051e-6a90-4582-a411-28a106c37118","Type":"ContainerStarted","Data":"b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.270072 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-c8crg"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.284424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" event={"ID":"21c9358d-2c84-4c38-9c91-8ca3dad4dab7","Type":"ContainerStarted","Data":"31f726768bac25884e79e07c0eebcdc8bdfdffb1c222c432046d6d7b0aa6cbe3"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.287522 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" event={"ID":"4c1ba89a-aed6-4245-8411-4d1fecac2500","Type":"ContainerStarted","Data":"fab89acb33673d7163cf70dd5e20b02a7c5cfe8640d4de7f772733bc5c58d3cb"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.290193 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" event={"ID":"52210224-8989-4e16-8fdf-4ea3a8211b10","Type":"ContainerStarted","Data":"f1e539c8860b9e1818c85b8fd992e545698bb5822d4096f90949ff0ae808c317"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.292311 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" event={"ID":"bc80030a-428b-4643-9d8d-2b0e9c873060","Type":"ContainerStarted","Data":"a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.294472 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" event={"ID":"fe4ddc70-f382-4b32-8879-122023b45438","Type":"ContainerStarted","Data":"8c3c232ec66ab427da213fe84bb964a64e40f8c0b5af7bf549ab7f717fa8d37e"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.296044 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" event={"ID":"3a22a8d8-92cd-4177-a597-9c659673392c","Type":"ContainerStarted","Data":"5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.347689 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.352414 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72ea5d65_7221_4b25_9025_7a5c31bae331.slice/crio-5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5 WatchSource:0}: Error finding container 5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5: Status 404 returned error can't find the container with id 
5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5 Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.354461 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b32bae_fa65_45aa_a8db_b46a7351ee2c.slice/crio-f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326 WatchSource:0}: Error finding container f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326: Status 404 returned error can't find the container with id f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326 Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.354547 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.361236 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod358d4809_db3b_4468_8c8c_4ffbedc0ec89.slice/crio-7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e WatchSource:0}: Error finding container 7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e: Status 404 returned error can't find the container with id 7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.364814 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2"] Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.366718 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wz24w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-r8jn7_openstack-operators(358d4809-db3b-4468-8c8c-4ffbedc0ec89): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.367847 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podUID="358d4809-db3b-4468-8c8c-4ffbedc0ec89" Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.423997 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.431985 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.434822 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc93761d_ecc1_4179_8287_40fd76ba5ad1.slice/crio-2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1 WatchSource:0}: Error finding container 2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1: Status 404 returned error can't find the container with id 2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1 Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.435311 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26dfba7a_f5fa_45bc_a187_91ddce4da2d6.slice/crio-8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f WatchSource:0}: Error finding container 8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f: Status 404 returned error can't find the container with id 8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.437672 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x"] Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.442073 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5x78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-hpmzq_openstack-operators(bc93761d-ecc1-4179-8287-40fd76ba5ad1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.443439 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podUID="bc93761d-ecc1-4179-8287-40fd76ba5ad1" Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.444277 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b4b9ba_026c_4fd7_a57d_545e62b6981e.slice/crio-e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c WatchSource:0}: Error finding container e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c: Status 404 returned error can't find the container with id e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.446789 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hgm46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-bfh6x_openstack-operators(14b4b9ba-026c-4fd7-a57d-545e62b6981e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.447981 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podUID="14b4b9ba-026c-4fd7-a57d-545e62b6981e" Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.604358 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.619116 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88862bd4_c890_447c_b4ee_b9cb1a4928e8.slice/crio-f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4 WatchSource:0}: Error finding container f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4: Status 404 returned error can't find the container with id f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4 Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.667442 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.667676 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.668067 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:02.668032421 +0000 UTC m=+1002.265964950 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.176019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.176139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176237 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176339 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:03.176314909 +0000 UTC m=+1002.774247438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176343 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176457 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:03.176430982 +0000 UTC m=+1002.774363601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.351612 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" event={"ID":"bc93761d-ecc1-4179-8287-40fd76ba5ad1","Type":"ContainerStarted","Data":"2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1"} Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.354844 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podUID="bc93761d-ecc1-4179-8287-40fd76ba5ad1" Mar 20 13:47:01 crc 
kubenswrapper[4755]: I0320 13:47:01.405080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" event={"ID":"14b4b9ba-026c-4fd7-a57d-545e62b6981e","Type":"ContainerStarted","Data":"e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c"} Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.406449 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podUID="14b4b9ba-026c-4fd7-a57d-545e62b6981e" Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.410689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" event={"ID":"b3c037b9-79d2-45ea-9b92-66e50eb20e6b","Type":"ContainerStarted","Data":"a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.412740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" event={"ID":"a1b32bae-fa65-45aa-a8db-b46a7351ee2c","Type":"ContainerStarted","Data":"f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.420840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" event={"ID":"72ea5d65-7221-4b25-9025-7a5c31bae331","Type":"ContainerStarted","Data":"5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.425984 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" event={"ID":"88862bd4-c890-447c-b4ee-b9cb1a4928e8","Type":"ContainerStarted","Data":"f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.432332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" event={"ID":"358d4809-db3b-4468-8c8c-4ffbedc0ec89","Type":"ContainerStarted","Data":"7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e"} Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.434273 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podUID="358d4809-db3b-4468-8c8c-4ffbedc0ec89" Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.441972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" event={"ID":"1aaef0d5-16fe-4c61-82d5-660f29168171","Type":"ContainerStarted","Data":"18344850f9150281d5e9e317dc947a0e7719d553176de1cfac37f0292e5649cc"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.443853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" event={"ID":"26dfba7a-f5fa-45bc-a187-91ddce4da2d6","Type":"ContainerStarted","Data":"8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f"} Mar 20 13:47:02 crc kubenswrapper[4755]: I0320 13:47:02.302132 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod 
\"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.302317 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.302409 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:06.302385824 +0000 UTC m=+1005.900318353 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.525886 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podUID="358d4809-db3b-4468-8c8c-4ffbedc0ec89" Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.526282 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podUID="14b4b9ba-026c-4fd7-a57d-545e62b6981e" Mar 20 
13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.532258 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podUID="bc93761d-ecc1-4179-8287-40fd76ba5ad1" Mar 20 13:47:02 crc kubenswrapper[4755]: I0320 13:47:02.708607 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.708824 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.708922 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:06.708896813 +0000 UTC m=+1006.306829332 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: I0320 13:47:03.217105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:03 crc kubenswrapper[4755]: I0320 13:47:03.217217 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217371 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217381 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217438 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:07.217417388 +0000 UTC m=+1006.815349917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217494 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:07.217459619 +0000 UTC m=+1006.815392248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: I0320 13:47:06.393760 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.394062 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.394260 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:14.394235544 +0000 UTC m=+1013.992168073 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: I0320 13:47:06.799922 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.800201 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.800320 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:14.800292641 +0000 UTC m=+1014.398225180 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: I0320 13:47:07.308889 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:07 crc kubenswrapper[4755]: I0320 13:47:07.309537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.309798 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.309975 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:15.309909636 +0000 UTC m=+1014.907842245 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.310012 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.310137 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:15.310110061 +0000 UTC m=+1014.908042590 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: I0320 13:47:14.428554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.428965 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.429482 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert 
podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:30.429448225 +0000 UTC m=+1030.027380764 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: I0320 13:47:14.836132 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.836724 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.836971 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:30.836941161 +0000 UTC m=+1030.434873730 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: I0320 13:47:15.346809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:15 crc kubenswrapper[4755]: I0320 13:47:15.347591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347000 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347717 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:31.347687397 +0000 UTC m=+1030.945619936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347861 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347972 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:31.347942303 +0000 UTC m=+1030.945874842 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.887598 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55" Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.887967 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b8b2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-2xgt8_openstack-operators(72ea5d65-7221-4b25-9025-7a5c31bae331): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.889144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" podUID="72ea5d65-7221-4b25-9025-7a5c31bae331" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.465330 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.465563 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qg89d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-f2nbs_openstack-operators(21c9358d-2c84-4c38-9c91-8ca3dad4dab7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.466817 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" podUID="21c9358d-2c84-4c38-9c91-8ca3dad4dab7" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.640352 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" podUID="72ea5d65-7221-4b25-9025-7a5c31bae331" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.641419 4755 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" podUID="21c9358d-2c84-4c38-9c91-8ca3dad4dab7" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.092205 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.092700 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mb8qk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-c8crg_openstack-operators(1aaef0d5-16fe-4c61-82d5-660f29168171): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.094007 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" podUID="1aaef0d5-16fe-4c61-82d5-660f29168171" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.563351 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.563558 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wz6tz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-4khh5_openstack-operators(26dfba7a-f5fa-45bc-a187-91ddce4da2d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.564745 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" podUID="26dfba7a-f5fa-45bc-a187-91ddce4da2d6" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.648032 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" podUID="26dfba7a-f5fa-45bc-a187-91ddce4da2d6" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.651357 4755 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" podUID="1aaef0d5-16fe-4c61-82d5-660f29168171" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.057830 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.058088 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4fz62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-j7qf2_openstack-operators(a1b32bae-fa65-45aa-a8db-b46a7351ee2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.059290 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" podUID="a1b32bae-fa65-45aa-a8db-b46a7351ee2c" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.668118 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" podUID="a1b32bae-fa65-45aa-a8db-b46a7351ee2c" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.774277 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.774494 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zh9w7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-z9px7_openstack-operators(88862bd4-c890-447c-b4ee-b9cb1a4928e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.775676 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" podUID="88862bd4-c890-447c-b4ee-b9cb1a4928e8" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.674395 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" podUID="88862bd4-c890-447c-b4ee-b9cb1a4928e8" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.972852 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.973093 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdzd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-4x5nd_openstack-operators(5a83ca27-3334-4aac-9129-5635d3af0714): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.974339 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" podUID="5a83ca27-3334-4aac-9129-5635d3af0714" Mar 20 13:47:21 crc kubenswrapper[4755]: E0320 13:47:21.683708 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" podUID="5a83ca27-3334-4aac-9129-5635d3af0714" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.702278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" event={"ID":"bc93761d-ecc1-4179-8287-40fd76ba5ad1","Type":"ContainerStarted","Data":"51f77d82208f79ab9b94d6de5a02f4516479d986ab94fd2d8e56200808a34d71"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.702975 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.708963 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" event={"ID":"14b4b9ba-026c-4fd7-a57d-545e62b6981e","Type":"ContainerStarted","Data":"13ec6aa6d4ffdedffdb707080bbc0f6583b0cab4a5436d93282645c138d43db0"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.709305 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.715226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" event={"ID":"52210224-8989-4e16-8fdf-4ea3a8211b10","Type":"ContainerStarted","Data":"b1783584712b3e83e4d823314ce43ccfb4ea143bf780290131daa70b1304a4fd"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.716132 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.718246 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" event={"ID":"fe4ddc70-f382-4b32-8879-122023b45438","Type":"ContainerStarted","Data":"6831d686167bae994649acf600a2c17990e095119528af9e7ccbc52400b4d021"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.719249 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.728769 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" event={"ID":"b3c037b9-79d2-45ea-9b92-66e50eb20e6b","Type":"ContainerStarted","Data":"0955ee0cf5b2c84c17e46b7dad60c95b464ad8c066a3c63ba1c0bd2926469eee"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.729403 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.734494 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podStartSLOduration=2.995093318 podStartE2EDuration="24.734480305s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.441952431 +0000 UTC m=+1000.039884960" lastFinishedPulling="2026-03-20 13:47:22.181339408 +0000 UTC m=+1021.779271947" observedRunningTime="2026-03-20 13:47:22.729409918 +0000 UTC m=+1022.327342457" watchObservedRunningTime="2026-03-20 13:47:22.734480305 +0000 UTC m=+1022.332412834" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.745241 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.755743 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" podStartSLOduration=2.7728989029999997 podStartE2EDuration="24.755709043s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.12288109 +0000 UTC m=+999.720813619" lastFinishedPulling="2026-03-20 13:47:22.10569123 +0000 UTC m=+1021.703623759" observedRunningTime="2026-03-20 13:47:22.750419739 +0000 UTC m=+1022.348352268" watchObservedRunningTime="2026-03-20 13:47:22.755709043 +0000 UTC m=+1022.353641572" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.762300 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" event={"ID":"7f51051e-6a90-4582-a411-28a106c37118","Type":"ContainerStarted","Data":"d8ee2702de03b4e28f3a6252433fb6aa3bbe6e0cee5e292c6d0831cbacbeed8d"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.762999 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.772999 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" podStartSLOduration=4.083491819 podStartE2EDuration="24.772985493s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.263612529 +0000 UTC m=+999.861545058" lastFinishedPulling="2026-03-20 13:47:20.953106203 +0000 UTC m=+1020.551038732" observedRunningTime="2026-03-20 13:47:22.76956498 +0000 UTC m=+1022.367497509" watchObservedRunningTime="2026-03-20 13:47:22.772985493 +0000 UTC m=+1022.370918022" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.775724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" event={"ID":"00fc80a4-4ea8-4f61-8795-6473f0adc40a","Type":"ContainerStarted","Data":"b26ea63fb4e3917cc230ce1f459655861bf280c4e156c1b0785ce4f51952f639"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.776764 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.787882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" event={"ID":"bc80030a-428b-4643-9d8d-2b0e9c873060","Type":"ContainerStarted","Data":"7060ec59cc1e97cb50ff8fab5f93ce9d52d1cf68fb1fe5ba256f5ef9dd6ffed5"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.788773 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.799162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" event={"ID":"552e0390-e86e-4972-bf6f-a4570e6b6f81","Type":"ContainerStarted","Data":"50ed27fc586ae65beb3715305a9bd491d09e4dff9a3971affeb8b32ab377236e"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.800450 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.810794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" event={"ID":"3a22a8d8-92cd-4177-a597-9c659673392c","Type":"ContainerStarted","Data":"4eb791f1d0a17ac4767e8a95fb5074e97da3c08cf17981e4a762f607691726b9"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.811623 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.820398 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" podStartSLOduration=2.932545337 podStartE2EDuration="24.820369662s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.619900006 +0000 UTC m=+999.217832535" lastFinishedPulling="2026-03-20 13:47:21.507724321 +0000 UTC m=+1021.105656860" observedRunningTime="2026-03-20 13:47:22.802711162 +0000 UTC m=+1022.400643691" watchObservedRunningTime="2026-03-20 13:47:22.820369662 +0000 UTC m=+1022.418302201" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.836034 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podStartSLOduration=3.000441984 podStartE2EDuration="24.836004238s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.446575336 +0000 UTC m=+1000.044507865" lastFinishedPulling="2026-03-20 13:47:22.28213759 +0000 UTC m=+1021.880070119" observedRunningTime="2026-03-20 13:47:22.831906266 +0000 UTC m=+1022.429838795" watchObservedRunningTime="2026-03-20 13:47:22.836004238 +0000 UTC m=+1022.433936767" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.870096 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" podStartSLOduration=2.749760195 podStartE2EDuration="24.870070565s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.985339128 +0000 UTC m=+999.583271657" lastFinishedPulling="2026-03-20 13:47:22.105649488 +0000 UTC m=+1021.703582027" observedRunningTime="2026-03-20 13:47:22.865573132 +0000 UTC 
m=+1022.463505661" watchObservedRunningTime="2026-03-20 13:47:22.870070565 +0000 UTC m=+1022.468003094" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.904358 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" podStartSLOduration=2.26292405 podStartE2EDuration="24.904332107s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.464295153 +0000 UTC m=+999.062227682" lastFinishedPulling="2026-03-20 13:47:22.10570321 +0000 UTC m=+1021.703635739" observedRunningTime="2026-03-20 13:47:22.900150672 +0000 UTC m=+1022.498083201" watchObservedRunningTime="2026-03-20 13:47:22.904332107 +0000 UTC m=+1022.502264636" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.936235 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" podStartSLOduration=3.648512985 podStartE2EDuration="24.936213764s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.220055273 +0000 UTC m=+999.817987802" lastFinishedPulling="2026-03-20 13:47:21.507756052 +0000 UTC m=+1021.105688581" observedRunningTime="2026-03-20 13:47:22.934503478 +0000 UTC m=+1022.532435997" watchObservedRunningTime="2026-03-20 13:47:22.936213764 +0000 UTC m=+1022.534146293" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.965167 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" podStartSLOduration=3.571256584 podStartE2EDuration="24.965145131s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.113972068 +0000 UTC m=+999.711904597" lastFinishedPulling="2026-03-20 13:47:21.507860605 +0000 UTC m=+1021.105793144" observedRunningTime="2026-03-20 13:47:22.96214332 +0000 UTC m=+1022.560075849" 
watchObservedRunningTime="2026-03-20 13:47:22.965145131 +0000 UTC m=+1022.563077660" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.988829 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" podStartSLOduration=3.398534864 podStartE2EDuration="24.988805145s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.916911706 +0000 UTC m=+999.514844235" lastFinishedPulling="2026-03-20 13:47:21.507181977 +0000 UTC m=+1021.105114516" observedRunningTime="2026-03-20 13:47:22.985465684 +0000 UTC m=+1022.583398213" watchObservedRunningTime="2026-03-20 13:47:22.988805145 +0000 UTC m=+1022.586737674" Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.027034 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" podStartSLOduration=2.312835588 podStartE2EDuration="25.027012644s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.391509193 +0000 UTC m=+998.989441722" lastFinishedPulling="2026-03-20 13:47:22.105686249 +0000 UTC m=+1021.703618778" observedRunningTime="2026-03-20 13:47:23.021252848 +0000 UTC m=+1022.619185377" watchObservedRunningTime="2026-03-20 13:47:23.027012644 +0000 UTC m=+1022.624945173" Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.822542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" event={"ID":"4c1ba89a-aed6-4245-8411-4d1fecac2500","Type":"ContainerStarted","Data":"e5f4e30e876b9cc4e7fceeaf44b19e4be593da0a7bcdb75531556284eec32389"} Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.824686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" 
event={"ID":"358d4809-db3b-4468-8c8c-4ffbedc0ec89","Type":"ContainerStarted","Data":"04b4df9a34323cfb67606ee38e56a0376b5b92c846b8222838e32d1b05b256c5"} Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.854030 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podStartSLOduration=4.041442756 podStartE2EDuration="25.853998603s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.36657469 +0000 UTC m=+999.964507219" lastFinishedPulling="2026-03-20 13:47:22.179130537 +0000 UTC m=+1021.777063066" observedRunningTime="2026-03-20 13:47:23.847808744 +0000 UTC m=+1023.445741293" watchObservedRunningTime="2026-03-20 13:47:23.853998603 +0000 UTC m=+1023.451931132" Mar 20 13:47:28 crc kubenswrapper[4755]: I0320 13:47:28.689714 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:47:28 crc kubenswrapper[4755]: I0320 13:47:28.707957 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:47:28 crc kubenswrapper[4755]: I0320 13:47:28.865952 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.026385 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.048867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.076568 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.145582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.149215 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.220721 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.222569 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.260989 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.335271 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.454528 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.877688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" event={"ID":"72ea5d65-7221-4b25-9025-7a5c31bae331","Type":"ContainerStarted","Data":"834f8a22d24478ac5b10d12b1369e14e15adff26e22af3f9492365ffb6d826f9"} Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 
13:47:29.878092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.896798 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" podStartSLOduration=3.58105853 podStartE2EDuration="31.896769348s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.356161846 +0000 UTC m=+999.954094375" lastFinishedPulling="2026-03-20 13:47:28.671872664 +0000 UTC m=+1028.269805193" observedRunningTime="2026-03-20 13:47:29.895571036 +0000 UTC m=+1029.493503595" watchObservedRunningTime="2026-03-20 13:47:29.896769348 +0000 UTC m=+1029.494701897" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.486107 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.499646 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.667848 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-l7q5f" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.676392 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.896718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.905107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.069831 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"] Mar 20 13:47:31 crc kubenswrapper[4755]: W0320 13:47:31.080717 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83d6120d_b54b_452c_aa8a_026665f1afae.slice/crio-3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb WatchSource:0}: Error finding container 3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb: Status 404 returned error can't find the container with id 3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.125868 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-n4kd2" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 
13:47:31.129935 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.406931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.407438 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.413217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.413598 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 
13:47:31.567214 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v"] Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.615484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7dlnc" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.622352 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.908504 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6"] Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.914688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" event={"ID":"bad91c65-94da-4f8a-addb-21b037197217","Type":"ContainerStarted","Data":"3d81e557f89e6ec8ca35b9228ca4108698597194ed24f29a155f62a42056e450"} Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.915809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" event={"ID":"1aaef0d5-16fe-4c61-82d5-660f29168171","Type":"ContainerStarted","Data":"b0c9a875a201c24b71b411179f6292b2276738811db10345de6d877b34630120"} Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.916752 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.920112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" 
event={"ID":"83d6120d-b54b-452c-aa8a-026665f1afae","Type":"ContainerStarted","Data":"3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb"} Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.936259 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" podStartSLOduration=2.536412381 podStartE2EDuration="33.936237142s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.285293429 +0000 UTC m=+999.883225958" lastFinishedPulling="2026-03-20 13:47:31.68511819 +0000 UTC m=+1031.283050719" observedRunningTime="2026-03-20 13:47:31.93063335 +0000 UTC m=+1031.528565869" watchObservedRunningTime="2026-03-20 13:47:31.936237142 +0000 UTC m=+1031.534169671" Mar 20 13:47:32 crc kubenswrapper[4755]: W0320 13:47:32.615917 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c9c167_c386_4d60_868c_8b0b63fccbcd.slice/crio-667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211 WatchSource:0}: Error finding container 667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211: Status 404 returned error can't find the container with id 667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211 Mar 20 13:47:32 crc kubenswrapper[4755]: I0320 13:47:32.933465 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" event={"ID":"42c9c167-c386-4d60-868c-8b0b63fccbcd","Type":"ContainerStarted","Data":"667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.944579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" 
event={"ID":"42c9c167-c386-4d60-868c-8b0b63fccbcd","Type":"ContainerStarted","Data":"d3610573dcd7197c2967de8d19220fa13ea282ce0acaa0b9586acababdccc496"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.945025 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.947246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" event={"ID":"21c9358d-2c84-4c38-9c91-8ca3dad4dab7","Type":"ContainerStarted","Data":"db826354ebdf6b02c5f701b694b96a03663af0434e028a4e6bac7728a9c56c64"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.947512 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.949966 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" event={"ID":"83d6120d-b54b-452c-aa8a-026665f1afae","Type":"ContainerStarted","Data":"6272366f186751bdf2848316a190c168b3e037536e78d9828e4fb8fc3b6574a8"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.950165 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.952723 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" event={"ID":"26dfba7a-f5fa-45bc-a187-91ddce4da2d6","Type":"ContainerStarted","Data":"d572a9132b41c964880fce2fe841c76c8023675373e0cbe6da6efb6508fe51dd"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.953381 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.955695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" event={"ID":"5a83ca27-3334-4aac-9129-5635d3af0714","Type":"ContainerStarted","Data":"09b0ae48ccfbe35e11690a2786dc6c6457b191d49902845e8b4f81381ca43a9b"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.956110 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.997749 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" podStartSLOduration=1.9098457450000002 podStartE2EDuration="35.997730986s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.619526796 +0000 UTC m=+999.217459315" lastFinishedPulling="2026-03-20 13:47:33.707412027 +0000 UTC m=+1033.305344556" observedRunningTime="2026-03-20 13:47:33.996302116 +0000 UTC m=+1033.594234655" watchObservedRunningTime="2026-03-20 13:47:33.997730986 +0000 UTC m=+1033.595663515" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.002100 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" podStartSLOduration=35.002090544 podStartE2EDuration="35.002090544s" podCreationTimestamp="2026-03-20 13:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:33.980806975 +0000 UTC m=+1033.578739534" watchObservedRunningTime="2026-03-20 13:47:34.002090544 +0000 UTC m=+1033.600023073" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.034023 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" podStartSLOduration=34.308931612 podStartE2EDuration="36.034007643s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:31.083270177 +0000 UTC m=+1030.681202706" lastFinishedPulling="2026-03-20 13:47:32.808346208 +0000 UTC m=+1032.406278737" observedRunningTime="2026-03-20 13:47:34.030617961 +0000 UTC m=+1033.628550500" watchObservedRunningTime="2026-03-20 13:47:34.034007643 +0000 UTC m=+1033.631940172" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.061047 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" podStartSLOduration=3.694472676 podStartE2EDuration="36.061031038s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.441898779 +0000 UTC m=+1000.039831308" lastFinishedPulling="2026-03-20 13:47:32.808457141 +0000 UTC m=+1032.406389670" observedRunningTime="2026-03-20 13:47:34.060632436 +0000 UTC m=+1033.658564965" watchObservedRunningTime="2026-03-20 13:47:34.061031038 +0000 UTC m=+1033.658963567" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.084546 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" podStartSLOduration=2.813726665 podStartE2EDuration="36.084518137s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.538942354 +0000 UTC m=+999.136874883" lastFinishedPulling="2026-03-20 13:47:32.809733826 +0000 UTC m=+1032.407666355" observedRunningTime="2026-03-20 13:47:34.080274881 +0000 UTC m=+1033.678207410" watchObservedRunningTime="2026-03-20 13:47:34.084518137 +0000 UTC m=+1033.682450666" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.971253 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" event={"ID":"88862bd4-c890-447c-b4ee-b9cb1a4928e8","Type":"ContainerStarted","Data":"8d5fc94f6b72f307ce4f9c317f2c9532211b1207a5a58aca33c0afe897df4dc6"} Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.971844 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.973858 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" event={"ID":"bad91c65-94da-4f8a-addb-21b037197217","Type":"ContainerStarted","Data":"ab8f9c7ca3ee651697a5f40226da3f251c2ca377bfe8241e9903e1197190d33a"} Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.973941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.976118 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" event={"ID":"a1b32bae-fa65-45aa-a8db-b46a7351ee2c","Type":"ContainerStarted","Data":"9dfb913aac9e7cd5ba15e0dc7557d209740f1531aaa5fee49dfa2dda0a6c439f"} Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.976298 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.998752 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" podStartSLOduration=3.527684228 podStartE2EDuration="37.998722703s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 
13:47:00.62167331 +0000 UTC m=+1000.219605839" lastFinishedPulling="2026-03-20 13:47:35.092711785 +0000 UTC m=+1034.690644314" observedRunningTime="2026-03-20 13:47:35.994544144 +0000 UTC m=+1035.592476693" watchObservedRunningTime="2026-03-20 13:47:35.998722703 +0000 UTC m=+1035.596655262" Mar 20 13:47:36 crc kubenswrapper[4755]: I0320 13:47:36.040761 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" podStartSLOduration=34.532481679 podStartE2EDuration="38.040734274s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:31.583344661 +0000 UTC m=+1031.181277190" lastFinishedPulling="2026-03-20 13:47:35.091597256 +0000 UTC m=+1034.689529785" observedRunningTime="2026-03-20 13:47:36.035944929 +0000 UTC m=+1035.633877498" watchObservedRunningTime="2026-03-20 13:47:36.040734274 +0000 UTC m=+1035.638666813" Mar 20 13:47:36 crc kubenswrapper[4755]: I0320 13:47:36.059973 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" podStartSLOduration=3.327256775 podStartE2EDuration="38.059944557s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.361306747 +0000 UTC m=+999.959239276" lastFinishedPulling="2026-03-20 13:47:35.093994529 +0000 UTC m=+1034.691927058" observedRunningTime="2026-03-20 13:47:36.057877963 +0000 UTC m=+1035.655810502" watchObservedRunningTime="2026-03-20 13:47:36.059944557 +0000 UTC m=+1035.657877116" Mar 20 13:47:38 crc kubenswrapper[4755]: I0320 13:47:38.960751 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.018558 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.167031 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.299952 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.618944 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:47:40 crc kubenswrapper[4755]: I0320 13:47:40.686839 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:41 crc kubenswrapper[4755]: I0320 13:47:41.141958 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:41 crc kubenswrapper[4755]: I0320 13:47:41.638018 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:49 crc kubenswrapper[4755]: I0320 13:47:49.357919 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:47:49 crc kubenswrapper[4755]: I0320 13:47:49.800170 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.148482 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:48:00 
crc kubenswrapper[4755]: I0320 13:48:00.149862 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.153460 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.153873 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.154161 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.167636 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.261112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") pod \"auto-csr-approver-29566908-dmw6j\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.362811 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") pod \"auto-csr-approver-29566908-dmw6j\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.391434 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") 
pod \"auto-csr-approver-29566908-dmw6j\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.498508 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.996069 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:48:01 crc kubenswrapper[4755]: W0320 13:48:01.006367 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda434c164_9ea6_4062_b8f6_88bb58f41a64.slice/crio-de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6 WatchSource:0}: Error finding container de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6: Status 404 returned error can't find the container with id de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6 Mar 20 13:48:01 crc kubenswrapper[4755]: I0320 13:48:01.202358 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" event={"ID":"a434c164-9ea6-4062-b8f6-88bb58f41a64","Type":"ContainerStarted","Data":"de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6"} Mar 20 13:48:04 crc kubenswrapper[4755]: I0320 13:48:04.221774 4755 generic.go:334] "Generic (PLEG): container finished" podID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerID="52d80a295f203def80f45f1a56a14d0c5758de39ba1147d6937ffde8c9d85ad7" exitCode=0 Mar 20 13:48:04 crc kubenswrapper[4755]: I0320 13:48:04.221860 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" event={"ID":"a434c164-9ea6-4062-b8f6-88bb58f41a64","Type":"ContainerDied","Data":"52d80a295f203def80f45f1a56a14d0c5758de39ba1147d6937ffde8c9d85ad7"} Mar 20 13:48:05 crc kubenswrapper[4755]: 
I0320 13:48:05.575231 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:05 crc kubenswrapper[4755]: I0320 13:48:05.651460 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") pod \"a434c164-9ea6-4062-b8f6-88bb58f41a64\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " Mar 20 13:48:05 crc kubenswrapper[4755]: I0320 13:48:05.658121 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb" (OuterVolumeSpecName: "kube-api-access-rqkwb") pod "a434c164-9ea6-4062-b8f6-88bb58f41a64" (UID: "a434c164-9ea6-4062-b8f6-88bb58f41a64"). InnerVolumeSpecName "kube-api-access-rqkwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:05 crc kubenswrapper[4755]: I0320 13:48:05.753499 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.248961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" event={"ID":"a434c164-9ea6-4062-b8f6-88bb58f41a64","Type":"ContainerDied","Data":"de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6"} Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.249001 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.249005 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.670727 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.675866 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.836440 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:06 crc kubenswrapper[4755]: E0320 13:48:06.836789 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerName="oc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.836808 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerName="oc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.836931 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerName="oc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.837634 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839434 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-446mv" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839684 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839812 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839993 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.846411 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.896554 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.898180 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.900756 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.906771 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.973620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974172 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974303 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974486 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: 
I0320 13:48:07.077173 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.077616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.078023 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.078332 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.101489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.105034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.151737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.212225 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.247089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320783b7-7554-4157-b6cd-143d787dc30b" path="/var/lib/kubelet/pods/320783b7-7554-4157-b6cd-143d787dc30b/volumes" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.486380 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.527145 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:07 crc kubenswrapper[4755]: W0320 13:48:07.536804 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31232794_c643_4a0d_a32c_9bcd76b1e121.slice/crio-be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d WatchSource:0}: Error finding container be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d: Status 404 returned error can't find the container with id be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d Mar 20 13:48:08 crc kubenswrapper[4755]: I0320 13:48:08.278093 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" 
event={"ID":"126a2da5-9f66-4125-9d2a-424cbc297bfd","Type":"ContainerStarted","Data":"6fb1b92200cdb6cb763de2b6dc5645838d502f87f1501b1c535570e9f51eb04d"} Mar 20 13:48:08 crc kubenswrapper[4755]: I0320 13:48:08.280851 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" event={"ID":"31232794-c643-4a0d-a32c-9bcd76b1e121","Type":"ContainerStarted","Data":"be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d"} Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.513831 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.536224 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.542863 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.546638 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.653636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.653761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.653806 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.755810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.755941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.756188 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.757101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.757188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.787226 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.882870 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.950710 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.016849 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.024297 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.027437 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.078812 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.078926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.079029 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.181325 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.181882 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.181947 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.182636 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.182941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.204821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.347058 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.474146 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:10 crc kubenswrapper[4755]: W0320 13:48:10.490187 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d52487_e77f_403a_a60e_af716068e035.slice/crio-17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b WatchSource:0}: Error finding container 17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b: Status 404 returned error can't find the container with id 17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.522545 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.531392 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.536912 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537312 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537407 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537586 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537737 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.540233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dj4wr" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.548134 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.603512 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691123 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691186 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691364 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691763 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d21386c-8267-4dba-9028-d5cb729ff78b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691813 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d21386c-8267-4dba-9028-d5cb729ff78b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691858 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt79h\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-kube-api-access-bt79h\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691923 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d21386c-8267-4dba-9028-d5cb729ff78b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798947 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt79h\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-kube-api-access-bt79h\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798992 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799018 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799059 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799163 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d21386c-8267-4dba-9028-d5cb729ff78b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799931 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.803491 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.803868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.804383 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.806194 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.811396 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.813411 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d21386c-8267-4dba-9028-d5cb729ff78b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.815274 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d21386c-8267-4dba-9028-d5cb729ff78b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.817218 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.820255 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.823793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt79h\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-kube-api-access-bt79h\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.848140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.864040 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.924305 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.928929 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931221 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931388 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931581 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931764 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nsxpj" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931790 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931806 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.932455 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.940050 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104449 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104492 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104538 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104566 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104587 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrr2\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-kube-api-access-8rrr2\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104706 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104731 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.105032 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.105104 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.208872 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-server-conf\") pod \"rabbitmq-server-0\" 
(UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.208954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209004 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209046 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209114 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209140 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8rrr2\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-kube-api-access-8rrr2\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209181 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209248 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.210153 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.210717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.212127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.212140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.213356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.214249 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 
13:48:11.216762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.227926 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.230232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.233177 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.234721 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrr2\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-kube-api-access-8rrr2\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.265005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.286514 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.317836 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerStarted","Data":"17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b"} Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.319378 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerStarted","Data":"b08752445d4cb4a4b8b6c2c978645cf8d6b89df6eef356585e9cb68a217e3d17"} Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.481339 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.483279 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.487593 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.487714 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.489391 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.491123 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.500593 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4drl9" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.506884 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641110 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-kolla-config\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641174 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82c5p\" (UniqueName: \"kubernetes.io/projected/23ab8e52-0cde-43ec-af8d-24f794695200-kube-api-access-82c5p\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641213 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-default\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641484 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.742942 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743403 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") device mount path 
\"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-kolla-config\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82c5p\" (UniqueName: \"kubernetes.io/projected/23ab8e52-0cde-43ec-af8d-24f794695200-kube-api-access-82c5p\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743900 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-default\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.744230 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-kolla-config\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.744966 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.747256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-default\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.750355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.751151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.764327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82c5p\" (UniqueName: 
\"kubernetes.io/projected/23ab8e52-0cde-43ec-af8d-24f794695200-kube-api-access-82c5p\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.780141 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0" Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.819955 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.805271 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.807814 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.814633 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.817341 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-v75sj" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.817426 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.817445 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.830142 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968046 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vcd6\" (UniqueName: \"kubernetes.io/projected/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kube-api-access-2vcd6\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968273 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc 
kubenswrapper[4755]: I0320 13:48:13.968940 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.969074 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.969346 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcd6\" (UniqueName: \"kubernetes.io/projected/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kube-api-access-2vcd6\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 
13:48:14.071588 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071693 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071824 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.072378 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.073208 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.073461 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.073847 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.074277 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.097192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.100076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcd6\" (UniqueName: \"kubernetes.io/projected/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kube-api-access-2vcd6\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.102374 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.126118 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.134932 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.136900 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.139275 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bw6d5" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.139369 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.139398 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.147085 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276382 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-kolla-config\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxpz\" (UniqueName: \"kubernetes.io/projected/1786d302-95f2-410e-8280-14a89cbaf48c-kube-api-access-zrxpz\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276750 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-config-data\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379000 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-kolla-config\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379074 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxpz\" (UniqueName: \"kubernetes.io/projected/1786d302-95f2-410e-8280-14a89cbaf48c-kube-api-access-zrxpz\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-config-data\") pod \"memcached-0\" (UID: 
\"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379236 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.380184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.380245 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-kolla-config\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.380192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-config-data\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.383530 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.389146 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.399263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxpz\" (UniqueName: \"kubernetes.io/projected/1786d302-95f2-410e-8280-14a89cbaf48c-kube-api-access-zrxpz\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.456193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.519844 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.463923 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.465293 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.468694 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-sjqzk" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.473921 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.623294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"kube-state-metrics-0\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.724839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"kube-state-metrics-0\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.746581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"kube-state-metrics-0\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.789303 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.454328 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kbcdp"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.455556 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.461332 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.461494 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-cjqnm" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.461614 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.478213 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.490386 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wbxnd"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.491895 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.514802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wbxnd"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-etc-ovs\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570555 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570576 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-log\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhh7\" (UniqueName: \"kubernetes.io/projected/408d869f-0966-4908-88e5-37cdff345c4a-kube-api-access-wkhh7\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570632 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-run\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570767 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/408d869f-0966-4908-88e5-37cdff345c4a-scripts\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2961ad5-0d2c-46e9-bb50-2e2893353945-scripts\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-log-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-ovn-controller-tls-certs\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-lib\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570909 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-combined-ca-bundle\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdlgt\" (UniqueName: \"kubernetes.io/projected/b2961ad5-0d2c-46e9-bb50-2e2893353945-kube-api-access-wdlgt\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672774 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-combined-ca-bundle\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdlgt\" (UniqueName: 
\"kubernetes.io/projected/b2961ad5-0d2c-46e9-bb50-2e2893353945-kube-api-access-wdlgt\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-etc-ovs\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672925 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-log\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672989 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkhh7\" (UniqueName: \"kubernetes.io/projected/408d869f-0966-4908-88e5-37cdff345c4a-kube-api-access-wkhh7\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-run\") pod \"ovn-controller-ovs-wbxnd\" (UID: 
\"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673048 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/408d869f-0966-4908-88e5-37cdff345c4a-scripts\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2961ad5-0d2c-46e9-bb50-2e2893353945-scripts\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673107 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-log-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-ovn-controller-tls-certs\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673166 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673207 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-lib\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673864 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-run\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-log\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.674076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-etc-ovs\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.674110 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-lib\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.674369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run-ovn\") pod 
\"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.676603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.676862 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-log-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.677677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2961ad5-0d2c-46e9-bb50-2e2893353945-scripts\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.678382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/408d869f-0966-4908-88e5-37cdff345c4a-scripts\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.684778 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-ovn-controller-tls-certs\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.686507 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-combined-ca-bundle\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.691942 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdlgt\" (UniqueName: \"kubernetes.io/projected/b2961ad5-0d2c-46e9-bb50-2e2893353945-kube-api-access-wdlgt\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.694076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkhh7\" (UniqueName: \"kubernetes.io/projected/408d869f-0966-4908-88e5-37cdff345c4a-kube-api-access-wkhh7\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.779448 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.798095 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.799495 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803109 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803181 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803109 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803273 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803431 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-l8pkm" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.809373 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.832916 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-config\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876975 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876994 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877068 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggt8\" (UniqueName: 
\"kubernetes.io/projected/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-kube-api-access-bggt8\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877148 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978512 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978586 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-config\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggt8\" (UniqueName: \"kubernetes.io/projected/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-kube-api-access-bggt8\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978731 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978760 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978804 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.979102 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.979258 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.979936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.980151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-config\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.982567 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.984044 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.984728 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.997438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggt8\" (UniqueName: \"kubernetes.io/projected/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-kube-api-access-bggt8\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:20 crc kubenswrapper[4755]: I0320 13:48:19.999836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:20 crc kubenswrapper[4755]: I0320 13:48:20.128607 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:22 crc kubenswrapper[4755]: I0320 13:48:22.594034 4755 scope.go:117] "RemoveContainer" containerID="dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.071887 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.074053 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.076155 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.076189 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.076920 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dq7gm" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.079210 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.091068 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175276 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175360 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-config\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.176074 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gtlq\" (UniqueName: \"kubernetes.io/projected/de877bb8-b1cd-45de-94c1-5242659fd03e-kube-api-access-5gtlq\") pod 
\"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.176115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277527 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gtlq\" (UniqueName: \"kubernetes.io/projected/de877bb8-b1cd-45de-94c1-5242659fd03e-kube-api-access-5gtlq\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277584 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277648 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277697 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: 
I0320 13:48:23.277722 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277742 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-config\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.278696 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-config\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.280793 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.280996 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.282051 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.285278 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.285718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.291357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 
13:48:23.299836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.312038 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gtlq\" (UniqueName: \"kubernetes.io/projected/de877bb8-b1cd-45de-94c1-5242659fd03e-kube-api-access-5gtlq\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.399984 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.658599 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.659096 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndt9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-qb4hg_openstack(31232794-c643-4a0d-a32c-9bcd76b1e121): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.660622 4755 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" podUID="31232794-c643-4a0d-a32c-9bcd76b1e121" Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.685311 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.685490 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tm97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-j9fdk_openstack(126a2da5-9f66-4125-9d2a-424cbc297bfd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.686963 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" podUID="126a2da5-9f66-4125-9d2a-424cbc297bfd" Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.363806 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.366744 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23ab8e52_0cde_43ec_af8d_24f794695200.slice/crio-e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e WatchSource:0}: Error finding container e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e: Status 404 returned error can't find the container with id e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.376900 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d21386c_8267_4dba_9028_d5cb729ff78b.slice/crio-d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61 WatchSource:0}: Error finding container d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61: Status 404 returned error can't find the container with id d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61 Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.377981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.431623 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerStarted","Data":"e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e"} Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.432673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerStarted","Data":"d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61"} Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.434954 4755 generic.go:334] "Generic (PLEG): container finished" podID="b8d52487-e77f-403a-a60e-af716068e035" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" exitCode=0 Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.435025 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerDied","Data":"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a"} Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.438683 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerID="3fd02a266f1ee85022a2ecd91d222ecf18e5e831fe110e3e465855de38b0e3d4" exitCode=0 Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.439436 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerDied","Data":"3fd02a266f1ee85022a2ecd91d222ecf18e5e831fe110e3e465855de38b0e3d4"} Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.528731 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.534675 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.540501 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1786d302_95f2_410e_8280_14a89cbaf48c.slice/crio-15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc WatchSource:0}: Error finding container 
15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc: Status 404 returned error can't find the container with id 15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.540568 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.549423 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.553804 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1ede4a6_e06a_4084_8ba6_5f1c7f838bbe.slice/crio-901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe WatchSource:0}: Error finding container 901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe: Status 404 returned error can't find the container with id 901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.735962 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed1ecda_4acb_4a4c_a84e_12e58b3ad243.slice/crio-755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc WatchSource:0}: Error finding container 755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc: Status 404 returned error can't find the container with id 755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.739881 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.770785 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp"] Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.842299 4755 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovn-controller-ovs-wbxnd"] Mar 20 13:48:25 crc kubenswrapper[4755]: E0320 13:48:25.849610 4755 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 13:48:25 crc kubenswrapper[4755]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 13:48:25 crc kubenswrapper[4755]: > podSandboxID="17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b" Mar 20 13:48:25 crc kubenswrapper[4755]: E0320 13:48:25.849797 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:48:25 crc kubenswrapper[4755]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7prj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-f44rr_openstack(b8d52487-e77f-403a-a60e-af716068e035): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 13:48:25 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:48:25 crc kubenswrapper[4755]: E0320 13:48:25.850854 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" podUID="b8d52487-e77f-403a-a60e-af716068e035" Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.923186 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.942429 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031235 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"31232794-c643-4a0d-a32c-9bcd76b1e121\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031368 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"31232794-c643-4a0d-a32c-9bcd76b1e121\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031494 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"31232794-c643-4a0d-a32c-9bcd76b1e121\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031535 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"126a2da5-9f66-4125-9d2a-424cbc297bfd\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031563 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"126a2da5-9f66-4125-9d2a-424cbc297bfd\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.032077 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config" (OuterVolumeSpecName: "config") pod "31232794-c643-4a0d-a32c-9bcd76b1e121" (UID: "31232794-c643-4a0d-a32c-9bcd76b1e121"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.032104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31232794-c643-4a0d-a32c-9bcd76b1e121" (UID: "31232794-c643-4a0d-a32c-9bcd76b1e121"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.032201 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config" (OuterVolumeSpecName: "config") pod "126a2da5-9f66-4125-9d2a-424cbc297bfd" (UID: "126a2da5-9f66-4125-9d2a-424cbc297bfd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.036959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97" (OuterVolumeSpecName: "kube-api-access-4tm97") pod "126a2da5-9f66-4125-9d2a-424cbc297bfd" (UID: "126a2da5-9f66-4125-9d2a-424cbc297bfd"). InnerVolumeSpecName "kube-api-access-4tm97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.037232 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d" (OuterVolumeSpecName: "kube-api-access-ndt9d") pod "31232794-c643-4a0d-a32c-9bcd76b1e121" (UID: "31232794-c643-4a0d-a32c-9bcd76b1e121"). InnerVolumeSpecName "kube-api-access-ndt9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.133793 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134207 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134219 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134297 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134308 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.451805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerStarted","Data":"901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.453732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerStarted","Data":"a0e5580e48257aae362f608b62e3bbc6073b0ba593958e3e0816180b07437a59"} Mar 20 
13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.455939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerStarted","Data":"71c82861b041c19bf20384296bc113b1205d4d94c31287f2a85b68ac81cef0b4"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.457835 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fed1ecda-4acb-4a4c-a84e-12e58b3ad243","Type":"ContainerStarted","Data":"755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.459197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1786d302-95f2-410e-8280-14a89cbaf48c","Type":"ContainerStarted","Data":"15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.460788 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" event={"ID":"31232794-c643-4a0d-a32c-9bcd76b1e121","Type":"ContainerDied","Data":"be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.460827 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.464385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp" event={"ID":"408d869f-0966-4908-88e5-37cdff345c4a","Type":"ContainerStarted","Data":"814c7070b25ab1fe6ca3c907111fcae65c59840a0d99df62c010f837dc326347"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.465789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" event={"ID":"126a2da5-9f66-4125-9d2a-424cbc297bfd","Type":"ContainerDied","Data":"6fb1b92200cdb6cb763de2b6dc5645838d502f87f1501b1c535570e9f51eb04d"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.465829 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.469186 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerStarted","Data":"93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.469800 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.472154 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerStarted","Data":"1f936cfbd135019d1572ee465a4fb61fade57721a1a7701a47ec15a9bf86c1cd"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.520712 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" podStartSLOduration=3.295801188 podStartE2EDuration="17.520690948s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" 
firstStartedPulling="2026-03-20 13:48:10.61117297 +0000 UTC m=+1070.209105499" lastFinishedPulling="2026-03-20 13:48:24.83606272 +0000 UTC m=+1084.433995259" observedRunningTime="2026-03-20 13:48:26.495103558 +0000 UTC m=+1086.093036097" watchObservedRunningTime="2026-03-20 13:48:26.520690948 +0000 UTC m=+1086.118623477" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.554939 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.622025 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.632430 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.639370 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.692508 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:48:27 crc kubenswrapper[4755]: I0320 13:48:27.237317 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126a2da5-9f66-4125-9d2a-424cbc297bfd" path="/var/lib/kubelet/pods/126a2da5-9f66-4125-9d2a-424cbc297bfd/volumes" Mar 20 13:48:27 crc kubenswrapper[4755]: I0320 13:48:27.237704 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31232794-c643-4a0d-a32c-9bcd76b1e121" path="/var/lib/kubelet/pods/31232794-c643-4a0d-a32c-9bcd76b1e121/volumes" Mar 20 13:48:30 crc kubenswrapper[4755]: W0320 13:48:30.291467 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde877bb8_b1cd_45de_94c1_5242659fd03e.slice/crio-903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae WatchSource:0}: Error finding container 
903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae: Status 404 returned error can't find the container with id 903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae Mar 20 13:48:30 crc kubenswrapper[4755]: I0320 13:48:30.348876 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:30 crc kubenswrapper[4755]: I0320 13:48:30.400500 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:30 crc kubenswrapper[4755]: I0320 13:48:30.508884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"de877bb8-b1cd-45de-94c1-5242659fd03e","Type":"ContainerStarted","Data":"903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae"} Mar 20 13:48:36 crc kubenswrapper[4755]: I0320 13:48:36.750841 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:48:36 crc kubenswrapper[4755]: I0320 13:48:36.751463 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.606883 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerStarted","Data":"8e7da803b4de22200ab08a28b849d2b793f22c38ee8203af20f34c671d088b6e"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.614280 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerStarted","Data":"8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.615082 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.621829 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1786d302-95f2-410e-8280-14a89cbaf48c","Type":"ContainerStarted","Data":"ac8f807fc1fcca9ddc96ee0bd88bc0a23040ab4b9e40a2dd44b106733da238e6"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.622033 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.633313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerStarted","Data":"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.633606 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" containerID="cri-o://4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" gracePeriod=10 Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.633725 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.651965 4755 generic.go:334] "Generic (PLEG): container finished" podID="b2961ad5-0d2c-46e9-bb50-2e2893353945" containerID="bf4e9e4330683d4ccd79b8f36beff0599ba6add00c8a7a23493065253bae6878" exitCode=0 Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.652041 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerDied","Data":"bf4e9e4330683d4ccd79b8f36beff0599ba6add00c8a7a23493065253bae6878"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.658602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp" event={"ID":"408d869f-0966-4908-88e5-37cdff345c4a","Type":"ContainerStarted","Data":"13af441eb89a42cfa2fddf8e375445e903e8202c57fe131607cbbae52bf16654"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.658703 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.671756 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fed1ecda-4acb-4a4c-a84e-12e58b3ad243","Type":"ContainerStarted","Data":"af947e5dc0897b31ec19fa3b48dc856fb157faf9cb5872ca5f78522c8e383953"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.700288 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.812029022 podStartE2EDuration="27.700266002s" podCreationTimestamp="2026-03-20 13:48:14 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.546853262 +0000 UTC m=+1085.144785791" lastFinishedPulling="2026-03-20 13:48:37.435090242 +0000 UTC m=+1097.033022771" observedRunningTime="2026-03-20 13:48:41.663984103 +0000 UTC m=+1101.261916632" watchObservedRunningTime="2026-03-20 13:48:41.700266002 +0000 UTC m=+1101.298198531" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.708749 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.686443541 podStartE2EDuration="25.708722985s" podCreationTimestamp="2026-03-20 13:48:16 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.562988665 +0000 UTC 
m=+1085.160921194" lastFinishedPulling="2026-03-20 13:48:40.585268099 +0000 UTC m=+1100.183200638" observedRunningTime="2026-03-20 13:48:41.682846096 +0000 UTC m=+1101.280778625" watchObservedRunningTime="2026-03-20 13:48:41.708722985 +0000 UTC m=+1101.306655514" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.714632 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"de877bb8-b1cd-45de-94c1-5242659fd03e","Type":"ContainerStarted","Data":"a0dd7b7fc6c15f4c319fe9bcd82f91e32adc8dc7930b71d2b5f3803ace890852"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.722112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerStarted","Data":"e9c3540d5249a821a4879cc4bdc6c9cc93aa5919ff7810f73673afeb85a6a2cb"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.727252 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" podStartSLOduration=18.360279707 podStartE2EDuration="32.727196398s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" firstStartedPulling="2026-03-20 13:48:10.496997998 +0000 UTC m=+1070.094930527" lastFinishedPulling="2026-03-20 13:48:24.863914689 +0000 UTC m=+1084.461847218" observedRunningTime="2026-03-20 13:48:41.716234001 +0000 UTC m=+1101.314166540" watchObservedRunningTime="2026-03-20 13:48:41.727196398 +0000 UTC m=+1101.325128917" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.749941 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kbcdp" podStartSLOduration=8.687077038 podStartE2EDuration="22.749917184s" podCreationTimestamp="2026-03-20 13:48:19 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.774032674 +0000 UTC m=+1085.371965223" lastFinishedPulling="2026-03-20 13:48:39.83687282 +0000 UTC m=+1099.434805369" observedRunningTime="2026-03-20 
13:48:41.739126941 +0000 UTC m=+1101.337059480" watchObservedRunningTime="2026-03-20 13:48:41.749917184 +0000 UTC m=+1101.347849713" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.185563 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.359850 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"b8d52487-e77f-403a-a60e-af716068e035\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.360278 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"b8d52487-e77f-403a-a60e-af716068e035\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.360369 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"b8d52487-e77f-403a-a60e-af716068e035\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.368815 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7" (OuterVolumeSpecName: "kube-api-access-7prj7") pod "b8d52487-e77f-403a-a60e-af716068e035" (UID: "b8d52487-e77f-403a-a60e-af716068e035"). InnerVolumeSpecName "kube-api-access-7prj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.401175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config" (OuterVolumeSpecName: "config") pod "b8d52487-e77f-403a-a60e-af716068e035" (UID: "b8d52487-e77f-403a-a60e-af716068e035"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.402705 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8d52487-e77f-403a-a60e-af716068e035" (UID: "b8d52487-e77f-403a-a60e-af716068e035"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.462118 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.462172 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.462185 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735280 4755 generic.go:334] "Generic (PLEG): container finished" podID="b8d52487-e77f-403a-a60e-af716068e035" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" exitCode=0 Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 
13:48:42.735381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerDied","Data":"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerDied","Data":"17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735442 4755 scope.go:117] "RemoveContainer" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735616 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.749595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerStarted","Data":"8e1766906106b58ad71f899855dae6854781edd4e85af35469d4a6541e6db08d"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.767980 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerStarted","Data":"f07f9cde3762a0eea76351152ab9e9747c414ceeede6f6c4913d32a01cfb2e75"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.768040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerStarted","Data":"d6047cc707b2bec5222eb333a26a6ecf68c2ecb99d4f55acbe0341219636bd6a"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.768393 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.768462 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.780260 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nmwms"] Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.780891 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="init" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.781334 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="init" Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.781368 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.781377 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.781635 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.783132 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.790109 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.794976 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerStarted","Data":"d6f7605f4c42bfaff2a7ad01f9513a1a2895247ba09bed2e9e8f4f0b129f847f"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.802819 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nmwms"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.817728 4755 scope.go:117] "RemoveContainer" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.872124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0e99e7-7429-41a7-bff7-23cafba6b78a-config\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjvsr\" (UniqueName: \"kubernetes.io/projected/3a0e99e7-7429-41a7-bff7-23cafba6b78a-kube-api-access-vjvsr\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873753 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovs-rundir\") pod 
\"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873803 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovn-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.874028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-combined-ca-bundle\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.878265 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.889285 4755 scope.go:117] "RemoveContainer" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.892064 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0\": container with ID starting with 
4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0 not found: ID does not exist" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.892101 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0"} err="failed to get container status \"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0\": rpc error: code = NotFound desc = could not find container \"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0\": container with ID starting with 4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0 not found: ID does not exist" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.892121 4755 scope.go:117] "RemoveContainer" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.892490 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a\": container with ID starting with f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a not found: ID does not exist" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.892513 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a"} err="failed to get container status \"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a\": rpc error: code = NotFound desc = could not find container \"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a\": container with ID starting with f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a not found: ID does not 
exist" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.894113 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.909819 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wbxnd" podStartSLOduration=9.976301507 podStartE2EDuration="23.909784563s" podCreationTimestamp="2026-03-20 13:48:19 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.903353564 +0000 UTC m=+1085.501286093" lastFinishedPulling="2026-03-20 13:48:39.83683661 +0000 UTC m=+1099.434769149" observedRunningTime="2026-03-20 13:48:42.881483761 +0000 UTC m=+1102.479416300" watchObservedRunningTime="2026-03-20 13:48:42.909784563 +0000 UTC m=+1102.507717082" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.939541 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.941943 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.943893 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975529 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0e99e7-7429-41a7-bff7-23cafba6b78a-config\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjvsr\" (UniqueName: \"kubernetes.io/projected/3a0e99e7-7429-41a7-bff7-23cafba6b78a-kube-api-access-vjvsr\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovs-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975753 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovn-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975854 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975875 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-combined-ca-bundle\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovn-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976444 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovs-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976442 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0e99e7-7429-41a7-bff7-23cafba6b78a-config\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.991110 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.996540 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-combined-ca-bundle\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.009623 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjvsr\" (UniqueName: \"kubernetes.io/projected/3a0e99e7-7429-41a7-bff7-23cafba6b78a-kube-api-access-vjvsr\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077337 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077418 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077449 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.081545 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.082945 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.084025 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.108029 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.128112 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.250011 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d52487-e77f-403a-a60e-af716068e035" path="/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volumes" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.250983 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.251361 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.259215 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.263071 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.276116 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.280290 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383724 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383830 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486221 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486387 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486410 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486437 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.487578 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.487909 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod 
\"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.489608 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.490859 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.507067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.603812 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.816179 4755 generic.go:334] "Generic (PLEG): container finished" podID="23ab8e52-0cde-43ec-af8d-24f794695200" containerID="8e7da803b4de22200ab08a28b849d2b793f22c38ee8203af20f34c671d088b6e" exitCode=0 Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.816361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerDied","Data":"8e7da803b4de22200ab08a28b849d2b793f22c38ee8203af20f34c671d088b6e"} Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.823722 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe" containerID="e9c3540d5249a821a4879cc4bdc6c9cc93aa5919ff7810f73673afeb85a6a2cb" exitCode=0 Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.823774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerDied","Data":"e9c3540d5249a821a4879cc4bdc6c9cc93aa5919ff7810f73673afeb85a6a2cb"} Mar 20 13:48:45 crc kubenswrapper[4755]: W0320 13:48:45.805171 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a0e99e7_7429_41a7_bff7_23cafba6b78a.slice/crio-f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b WatchSource:0}: Error finding container f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b: Status 404 returned error can't find the container with id f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.805396 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nmwms"] Mar 20 13:48:45 crc kubenswrapper[4755]: W0320 13:48:45.808489 4755 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bec7874_a7ec_4bf9_a716_0bf6bb9563fa.slice/crio-011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb WatchSource:0}: Error finding container 011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb: Status 404 returned error can't find the container with id 011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.812169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.839826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmwms" event={"ID":"3a0e99e7-7429-41a7-bff7-23cafba6b78a","Type":"ContainerStarted","Data":"f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.841135 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerStarted","Data":"011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.843782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fed1ecda-4acb-4a4c-a84e-12e58b3ad243","Type":"ContainerStarted","Data":"fb83634faa1a093a5211224c7e5c8a5272ed30f71d037303f0881e8562abb624"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.846009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerStarted","Data":"d2dc1b19311c9fbc046bc5201d86d95ee2a2ecb1e4d0343efe49d55d2eb775c9"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.847937 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"de877bb8-b1cd-45de-94c1-5242659fd03e","Type":"ContainerStarted","Data":"b69fac00382c6565cb7d427ec68d14180948f8f53f335adc13978223db89cf79"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.850333 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerStarted","Data":"cc1d3bbf69eaa17d4630b689bbb4a5fbe1d0b3d5011e9e32117acb1017cc80b2"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.868980 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.257119624 podStartE2EDuration="27.868960566s" podCreationTimestamp="2026-03-20 13:48:18 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.741076132 +0000 UTC m=+1085.339008661" lastFinishedPulling="2026-03-20 13:48:45.352917074 +0000 UTC m=+1104.950849603" observedRunningTime="2026-03-20 13:48:45.86111255 +0000 UTC m=+1105.459045079" watchObservedRunningTime="2026-03-20 13:48:45.868960566 +0000 UTC m=+1105.466893095" Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.889287 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.863543174 podStartE2EDuration="23.889267508s" podCreationTimestamp="2026-03-20 13:48:22 +0000 UTC" firstStartedPulling="2026-03-20 13:48:30.294182046 +0000 UTC m=+1089.892114575" lastFinishedPulling="2026-03-20 13:48:45.31990635 +0000 UTC m=+1104.917838909" observedRunningTime="2026-03-20 13:48:45.886349741 +0000 UTC m=+1105.484282270" watchObservedRunningTime="2026-03-20 13:48:45.889267508 +0000 UTC m=+1105.487200037" Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.925098 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.928189 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.96025484 podStartE2EDuration="33.928162037s" podCreationTimestamp="2026-03-20 13:48:12 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.563489468 +0000 UTC m=+1085.161421997" lastFinishedPulling="2026-03-20 13:48:37.531396665 +0000 UTC m=+1097.129329194" observedRunningTime="2026-03-20 13:48:45.922359455 +0000 UTC m=+1105.520291974" watchObservedRunningTime="2026-03-20 13:48:45.928162037 +0000 UTC m=+1105.526094566" Mar 20 13:48:45 crc kubenswrapper[4755]: W0320 13:48:45.940194 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e3517b_eb0a_41ae_8dde_78040dd4088e.slice/crio-a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb WatchSource:0}: Error finding container a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb: Status 404 returned error can't find the container with id a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.957270 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.92445026 podStartE2EDuration="34.957242869s" podCreationTimestamp="2026-03-20 13:48:11 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.369482075 +0000 UTC m=+1084.967414614" lastFinishedPulling="2026-03-20 13:48:39.402274704 +0000 UTC m=+1099.000207223" observedRunningTime="2026-03-20 13:48:45.94507519 +0000 UTC m=+1105.543007739" watchObservedRunningTime="2026-03-20 13:48:45.957242869 +0000 UTC m=+1105.555175398" Mar 20 13:48:46 crc kubenswrapper[4755]: E0320 13:48:46.282214 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e3517b_eb0a_41ae_8dde_78040dd4088e.slice/crio-11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e3517b_eb0a_41ae_8dde_78040dd4088e.slice/crio-conmon-11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.797682 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.864278 4755 generic.go:334] "Generic (PLEG): container finished" podID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" exitCode=0 Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.864407 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerDied","Data":"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.871018 4755 generic.go:334] "Generic (PLEG): container finished" podID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerID="11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97" exitCode=0 Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.871108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" event={"ID":"d8e3517b-eb0a-41ae-8dde-78040dd4088e","Type":"ContainerDied","Data":"11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.871146 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" 
event={"ID":"d8e3517b-eb0a-41ae-8dde-78040dd4088e","Type":"ContainerStarted","Data":"a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.886096 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmwms" event={"ID":"3a0e99e7-7429-41a7-bff7-23cafba6b78a","Type":"ContainerStarted","Data":"f6294141f948778e685f9c991c03d03c378e3300d2b33d4ffcdb6e4693d6c079"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.960578 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nmwms" podStartSLOduration=4.960553426 podStartE2EDuration="4.960553426s" podCreationTimestamp="2026-03-20 13:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:46.950601565 +0000 UTC m=+1106.548534094" watchObservedRunningTime="2026-03-20 13:48:46.960553426 +0000 UTC m=+1106.558485955" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.129189 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.172289 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.289454 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.366261 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.366385 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.367012 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.367078 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.375130 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4" (OuterVolumeSpecName: "kube-api-access-x5md4") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "kube-api-access-x5md4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.385805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config" (OuterVolumeSpecName: "config") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.401159 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.403421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.409397 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.459558 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471081 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471114 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471125 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471134 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.903322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerStarted","Data":"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24"} Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.903712 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.906616 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" 
event={"ID":"d8e3517b-eb0a-41ae-8dde-78040dd4088e","Type":"ContainerDied","Data":"a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb"} Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.906803 4755 scope.go:117] "RemoveContainer" containerID="11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.907063 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.908190 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.908223 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: E0320 13:48:47.911806 4755 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.181:59796->38.102.83.181:38787: read tcp 38.102.83.181:59796->38.102.83.181:38787: read: connection reset by peer Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.954070 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-jb5zr" podStartSLOduration=4.954045956 podStartE2EDuration="4.954045956s" podCreationTimestamp="2026-03-20 13:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:47.939766172 +0000 UTC m=+1107.537698711" watchObservedRunningTime="2026-03-20 13:48:47.954045956 +0000 UTC m=+1107.551978675" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.966394 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.976806 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.074182 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.081560 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.324581 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:48:48 crc kubenswrapper[4755]: E0320 13:48:48.325029 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerName="init" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.325052 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerName="init" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.325266 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerName="init" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.326212 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.329893 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qqrf6" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.330088 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.330129 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.335765 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.346440 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.391946 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.391987 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d72m\" (UniqueName: \"kubernetes.io/projected/d1bdd912-fe33-4449-aed8-12a5ee09961e-kube-api-access-4d72m\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-scripts\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" 
Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392055 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-config\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392079 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392180 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.493686 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.493764 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.493885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494049 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d72m\" (UniqueName: \"kubernetes.io/projected/d1bdd912-fe33-4449-aed8-12a5ee09961e-kube-api-access-4d72m\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-scripts\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-config\") pod \"ovn-northd-0\" (UID: 
\"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.495113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.495477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-scripts\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.495991 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-config\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.499440 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.509774 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.511115 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.512968 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d72m\" (UniqueName: \"kubernetes.io/projected/d1bdd912-fe33-4449-aed8-12a5ee09961e-kube-api-access-4d72m\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.647353 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.133822 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:48:49 crc kubenswrapper[4755]: W0320 13:48:49.135627 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1bdd912_fe33_4449_aed8_12a5ee09961e.slice/crio-8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63 WatchSource:0}: Error finding container 8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63: Status 404 returned error can't find the container with id 8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63 Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.236193 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" path="/var/lib/kubelet/pods/d8e3517b-eb0a-41ae-8dde-78040dd4088e/volumes" Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.521944 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.935842 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"d1bdd912-fe33-4449-aed8-12a5ee09961e","Type":"ContainerStarted","Data":"8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63"} Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.954687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1bdd912-fe33-4449-aed8-12a5ee09961e","Type":"ContainerStarted","Data":"57f8bc7ef5c60aaea9ff6f9d76059002e2b138b6ca44c5c979810e197fb70ce1"} Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.955118 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1bdd912-fe33-4449-aed8-12a5ee09961e","Type":"ContainerStarted","Data":"3a074e42c4e9a527a8f95be94d47cd42be948d4db4396fea9ffc29303c88cb72"} Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.957012 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.995590 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.91711162 podStartE2EDuration="2.995571637s" podCreationTimestamp="2026-03-20 13:48:48 +0000 UTC" firstStartedPulling="2026-03-20 13:48:49.138474979 +0000 UTC m=+1108.736407518" lastFinishedPulling="2026-03-20 13:48:50.216934986 +0000 UTC m=+1109.814867535" observedRunningTime="2026-03-20 13:48:50.984488406 +0000 UTC m=+1110.582420935" watchObservedRunningTime="2026-03-20 13:48:50.995571637 +0000 UTC m=+1110.593504156" Mar 20 13:48:52 crc kubenswrapper[4755]: I0320 13:48:52.820923 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 13:48:52 crc kubenswrapper[4755]: I0320 13:48:52.821481 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 13:48:52 crc kubenswrapper[4755]: I0320 13:48:52.945871 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.084747 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.606022 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.695352 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.697422 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" containerID="cri-o://93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74" gracePeriod=10 Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.457379 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.457499 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.551022 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.987386 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerID="93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74" exitCode=0 Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.987418 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" 
event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerDied","Data":"93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74"} Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.090219 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.307821 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.433404 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.433473 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.433550 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.462239 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4" (OuterVolumeSpecName: "kube-api-access-cmdb4") pod "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" (UID: "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664"). InnerVolumeSpecName "kube-api-access-cmdb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.470298 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:48:55 crc kubenswrapper[4755]: E0320 13:48:55.470807 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.470827 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" Mar 20 13:48:55 crc kubenswrapper[4755]: E0320 13:48:55.470923 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="init" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.470934 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="init" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.471134 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.471593 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.474399 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.481609 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.490436 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config" (OuterVolumeSpecName: "config") pod "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" (UID: "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.500616 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" (UID: "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.524283 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.525446 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.535887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.535974 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536037 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536051 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536063 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536248 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637746 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637854 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637926 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.638771 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 
crc kubenswrapper[4755]: I0320 13:48:55.664515 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.696372 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.697929 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.709427 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.739616 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.739718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.740684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"keystone-db-create-g2hvs\" (UID: 
\"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.758301 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.795285 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.796577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.798804 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.805962 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.841134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.841260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " 
pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.873339 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.878219 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943577 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"placement-9157-account-create-update-q8r48\" (UID: 
\"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.944706 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.967296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.011520 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.012080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerDied","Data":"b08752445d4cb4a4b8b6c2c978645cf8d6b89df6eef356585e9cb68a217e3d17"} Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.012126 4755 scope.go:117] "RemoveContainer" containerID="93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.015574 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.045259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.045348 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.046033 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.047227 4755 scope.go:117] "RemoveContainer" containerID="3fd02a266f1ee85022a2ecd91d222ecf18e5e831fe110e3e465855de38b0e3d4" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.076121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.081484 
4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.100534 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.111625 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.353353 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c00857_0d6a_4c12_8581_da16e2a24f04.slice/crio-fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034 WatchSource:0}: Error finding container fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034: Status 404 returned error can't find the container with id fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034 Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.356915 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.404286 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.405785 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0587eb58_cd5e_4e0b_be30_97e0a569fc57.slice/crio-cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69 WatchSource:0}: Error finding container cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69: Status 404 returned error can't find the container with id cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69 Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.428974 4755 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.433743 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0795b626_b382_4b9b_beb5_802cebc4f764.slice/crio-92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af WatchSource:0}: Error finding container 92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af: Status 404 returned error can't find the container with id 92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.510289 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.515443 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af42784_d5cc_4f7c_832a_f91dbd54cc3f.slice/crio-ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4 WatchSource:0}: Error finding container ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4: Status 404 returned error can't find the container with id ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4 Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.872210 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.873607 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.940026 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.965955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966056 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966146 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.027974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerStarted","Data":"80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.028018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerStarted","Data":"ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.037871 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerStarted","Data":"9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.037919 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerStarted","Data":"fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.041356 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerStarted","Data":"0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.041398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerStarted","Data":"cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.043711 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerStarted","Data":"674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.043743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerStarted","Data":"92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.051228 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-ng8vm" podStartSLOduration=2.051210608 podStartE2EDuration="2.051210608s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.047388588 +0000 UTC m=+1116.645321117" watchObservedRunningTime="2026-03-20 13:48:57.051210608 +0000 UTC m=+1116.649143137" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.066636 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-g2hvs" podStartSLOduration=2.066615642 podStartE2EDuration="2.066615642s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.062134334 +0000 UTC m=+1116.660066863" watchObservedRunningTime="2026-03-20 13:48:57.066615642 +0000 UTC m=+1116.664548171" Mar 20 13:48:57 crc 
kubenswrapper[4755]: I0320 13:48:57.068041 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068125 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.069135 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.069640 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.070518 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.071193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.087441 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9157-account-create-update-q8r48" podStartSLOduration=2.087417926 podStartE2EDuration="2.087417926s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.083202637 +0000 UTC m=+1116.681135176" watchObservedRunningTime="2026-03-20 13:48:57.087417926 +0000 UTC m=+1116.685350465" Mar 20 13:48:57 crc 
kubenswrapper[4755]: I0320 13:48:57.091415 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.104867 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8e2f-account-create-update-cvvh2" podStartSLOduration=2.104847533 podStartE2EDuration="2.104847533s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.098950029 +0000 UTC m=+1116.696882558" watchObservedRunningTime="2026-03-20 13:48:57.104847533 +0000 UTC m=+1116.702780072" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.191257 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.240132 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" path="/var/lib/kubelet/pods/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664/volumes" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.624289 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.998035 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.034232 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.038306 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.038350 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.038491 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lxknx" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.040915 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.046078 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.071958 4755 generic.go:334] "Generic (PLEG): container finished" podID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerID="1b3d02ef6c9328638dff17201b5ab810e62505bd6c549c62927f2bbd73723e85" exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.072033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerDied","Data":"1b3d02ef6c9328638dff17201b5ab810e62505bd6c549c62927f2bbd73723e85"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.072947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerStarted","Data":"267e7baeb6290269d8531900c4aac9bc633ebbbaa20000911465b29c50a00f91"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.075599 4755 generic.go:334] "Generic (PLEG): container finished" podID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerID="0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4" 
exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.075678 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerDied","Data":"0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.077413 4755 generic.go:334] "Generic (PLEG): container finished" podID="0795b626-b382-4b9b-beb5-802cebc4f764" containerID="674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2" exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.077463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerDied","Data":"674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-lock\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085461 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-cache\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085522 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkms8\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-kube-api-access-bkms8\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " 
pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70300053-7713-4d2c-8e59-a123e9f0f189-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.086443 4755 generic.go:334] "Generic (PLEG): container finished" podID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerID="80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb" exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.086610 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerDied","Data":"80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.090298 4755 generic.go:334] "Generic (PLEG): container finished" podID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerID="9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955" exitCode=0 Mar 20 13:48:58 crc 
kubenswrapper[4755]: I0320 13:48:58.090330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerDied","Data":"9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.187860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-lock\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-cache\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188160 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkms8\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-kube-api-access-bkms8\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: 
\"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188331 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70300053-7713-4d2c-8e59-a123e9f0f189-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.188860 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.188899 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.188947 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. No retries permitted until 2026-03-20 13:48:58.688927607 +0000 UTC m=+1118.286860146 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.189451 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.189677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-cache\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.190283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-lock\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.192570 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70300053-7713-4d2c-8e59-a123e9f0f189-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.209724 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkms8\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-kube-api-access-bkms8\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " 
pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.213376 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.497969 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-j55xs"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.499901 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.501939 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.502279 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.502872 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.511933 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j55xs"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594731 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.595213 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.595277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.595328 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697286 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697440 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697468 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697510 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697679 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697727 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697909 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.698366 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.698401 4755 projected.go:194] Error preparing data for projected volume etc-swift 
for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.698466 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. No retries permitted until 2026-03-20 13:48:59.698443816 +0000 UTC m=+1119.296376365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.698733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.698769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.703809 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.709627 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.717039 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.720573 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.832893 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.097622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerStarted","Data":"0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141"} Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.100108 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.130876 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podStartSLOduration=3.130847056 podStartE2EDuration="3.130847056s" podCreationTimestamp="2026-03-20 13:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:59.125191068 +0000 UTC m=+1118.723123607" watchObservedRunningTime="2026-03-20 13:48:59.130847056 +0000 UTC m=+1118.728779585" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.304008 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j55xs"] Mar 20 13:48:59 crc kubenswrapper[4755]: W0320 13:48:59.323262 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13f5042_e5e5_47a3_bc96_b504a0bf9af2.slice/crio-fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5 WatchSource:0}: Error finding container fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5: Status 404 returned error can't find the container with id fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5 Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.500136 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.620678 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.620832 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.622296 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af42784-d5cc-4f7c-832a-f91dbd54cc3f" (UID: "2af42784-d5cc-4f7c-832a-f91dbd54cc3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.628049 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb" (OuterVolumeSpecName: "kube-api-access-p5vhb") pod "2af42784-d5cc-4f7c-832a-f91dbd54cc3f" (UID: "2af42784-d5cc-4f7c-832a-f91dbd54cc3f"). InnerVolumeSpecName "kube-api-access-p5vhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.706900 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.708553 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.708574 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.708764 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.709258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.723069 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.723484 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.723593 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.723666 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 
nodeName:}" failed. No retries permitted until 2026-03-20 13:49:01.723633197 +0000 UTC m=+1121.321565726 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.724025 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.724043 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.728950 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.730075 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.740351 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.745714 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825571 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"79c00857-0d6a-4c12-8581-da16e2a24f04\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825637 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"0795b626-b382-4b9b-beb5-802cebc4f764\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825704 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825727 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"79c00857-0d6a-4c12-8581-da16e2a24f04\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825754 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"0795b626-b382-4b9b-beb5-802cebc4f764\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826024 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79c00857-0d6a-4c12-8581-da16e2a24f04" (UID: "79c00857-0d6a-4c12-8581-da16e2a24f04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826361 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0795b626-b382-4b9b-beb5-802cebc4f764" (UID: "0795b626-b382-4b9b-beb5-802cebc4f764"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826402 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0587eb58-cd5e-4e0b-be30-97e0a569fc57" (UID: "0587eb58-cd5e-4e0b-be30-97e0a569fc57"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826424 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826497 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826580 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826592 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826603 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.829059 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz" (OuterVolumeSpecName: "kube-api-access-vnzpz") pod "0587eb58-cd5e-4e0b-be30-97e0a569fc57" 
(UID: "0587eb58-cd5e-4e0b-be30-97e0a569fc57"). InnerVolumeSpecName "kube-api-access-vnzpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.829645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6" (OuterVolumeSpecName: "kube-api-access-g9vg6") pod "0795b626-b382-4b9b-beb5-802cebc4f764" (UID: "0795b626-b382-4b9b-beb5-802cebc4f764"). InnerVolumeSpecName "kube-api-access-g9vg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.831866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9" (OuterVolumeSpecName: "kube-api-access-zmsc9") pod "79c00857-0d6a-4c12-8581-da16e2a24f04" (UID: "79c00857-0d6a-4c12-8581-da16e2a24f04"). InnerVolumeSpecName "kube-api-access-zmsc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865112 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.865410 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865427 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.865458 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865466 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.865481 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865487 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865633 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865664 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865673 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.866133 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.874081 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.905781 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.927884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.927969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928026 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 
13:48:59.928075 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928162 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928194 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928206 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.943687 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 
13:49:00.029571 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.029724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.030336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.045681 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.065076 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.108353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerDied","Data":"fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.108404 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.108495 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.126246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerDied","Data":"cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.126308 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.126406 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.132732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerStarted","Data":"fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.138355 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerDied","Data":"92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.138392 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.138415 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.143826 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.143826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerDied","Data":"ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.143875 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.198406 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.410352 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.698393 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:49:00 crc kubenswrapper[4755]: W0320 13:49:00.705351 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46d041c2_e231_49fd_9d88_a991a1b9dd65.slice/crio-730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103 WatchSource:0}: Error finding container 730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103: Status 404 returned error can't find the container with id 730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103 Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.156139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerStarted","Data":"730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103"} Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.157955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerStarted","Data":"de372f04231662367647362de19336ae03c2d0702c48364546867341f54f7cc1"} Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.447769 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.448967 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.452924 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.477012 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.561563 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.561726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.666033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.666307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"root-account-create-update-9jkxf\" (UID: 
\"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.668513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.689990 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.768409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:01 crc kubenswrapper[4755]: E0320 13:49:01.768701 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:49:01 crc kubenswrapper[4755]: E0320 13:49:01.768845 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:49:01 crc kubenswrapper[4755]: E0320 13:49:01.768929 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:49:05.768890323 +0000 UTC m=+1125.366822862 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.781126 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:02 crc kubenswrapper[4755]: I0320 13:49:02.261737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:02 crc kubenswrapper[4755]: W0320 13:49:02.264882 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1b4f7f_9951_4976_b4e6_3222cc1ac6a2.slice/crio-35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0 WatchSource:0}: Error finding container 35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0: Status 404 returned error can't find the container with id 35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0 Mar 20 13:49:03 crc kubenswrapper[4755]: I0320 13:49:03.218556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerStarted","Data":"35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.230946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerStarted","Data":"d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.234584 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerStarted","Data":"618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.236432 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerStarted","Data":"ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.264492 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-9jkxf" podStartSLOduration=3.264466559 podStartE2EDuration="3.264466559s" podCreationTimestamp="2026-03-20 13:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:04.257730103 +0000 UTC m=+1123.855662652" watchObservedRunningTime="2026-03-20 13:49:04.264466559 +0000 UTC m=+1123.862399118" Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.277470 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-35fe-account-create-update-h6fl8" podStartSLOduration=5.277444749 podStartE2EDuration="5.277444749s" podCreationTimestamp="2026-03-20 13:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:04.271235507 +0000 UTC m=+1123.869168066" watchObservedRunningTime="2026-03-20 13:49:04.277444749 +0000 UTC m=+1123.875377278" Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.291842 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-pg2bq" podStartSLOduration=5.291821406 podStartE2EDuration="5.291821406s" podCreationTimestamp="2026-03-20 13:48:59 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:04.286848015 +0000 UTC m=+1123.884780544" watchObservedRunningTime="2026-03-20 13:49:04.291821406 +0000 UTC m=+1123.889753945" Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.245860 4755 generic.go:334] "Generic (PLEG): container finished" podID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerID="ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3" exitCode=0 Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.245950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerDied","Data":"ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3"} Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.248166 4755 generic.go:334] "Generic (PLEG): container finished" podID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" containerID="d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7" exitCode=0 Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.248263 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerDied","Data":"d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7"} Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.250450 4755 generic.go:334] "Generic (PLEG): container finished" podID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerID="618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810" exitCode=0 Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.250491 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerDied","Data":"618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810"} Mar 20 13:49:05 
crc kubenswrapper[4755]: I0320 13:49:05.858121 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:05 crc kubenswrapper[4755]: E0320 13:49:05.858314 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:49:05 crc kubenswrapper[4755]: E0320 13:49:05.859128 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:49:05 crc kubenswrapper[4755]: E0320 13:49:05.859335 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. No retries permitted until 2026-03-20 13:49:13.859302234 +0000 UTC m=+1133.457234803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.757734 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.758547 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.779244 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.781759 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"46d041c2-e231-49fd-9d88-a991a1b9dd65\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.781843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"46d041c2-e231-49fd-9d88-a991a1b9dd65\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.782598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46d041c2-e231-49fd-9d88-a991a1b9dd65" (UID: "46d041c2-e231-49fd-9d88-a991a1b9dd65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.789104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh" (OuterVolumeSpecName: "kube-api-access-cpsbh") pod "46d041c2-e231-49fd-9d88-a991a1b9dd65" (UID: "46d041c2-e231-49fd-9d88-a991a1b9dd65"). InnerVolumeSpecName "kube-api-access-cpsbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.884534 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.884570 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.119199 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.126071 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.188620 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"6fe77db3-29ef-42ae-840b-9736f07188ca\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.188895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"6fe77db3-29ef-42ae-840b-9736f07188ca\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189006 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod 
\"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189063 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fe77db3-29ef-42ae-840b-9736f07188ca" (UID: "6fe77db3-29ef-42ae-840b-9736f07188ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189808 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" (UID: "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.192938 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.195155 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt" (OuterVolumeSpecName: "kube-api-access-tcgjt") pod "6fe77db3-29ef-42ae-840b-9736f07188ca" (UID: "6fe77db3-29ef-42ae-840b-9736f07188ca"). InnerVolumeSpecName "kube-api-access-tcgjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.195996 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6" (OuterVolumeSpecName: "kube-api-access-bdld6") pod "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" (UID: "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2"). InnerVolumeSpecName "kube-api-access-bdld6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.276834 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.277645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerDied","Data":"de372f04231662367647362de19336ae03c2d0702c48364546867341f54f7cc1"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.277686 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de372f04231662367647362de19336ae03c2d0702c48364546867341f54f7cc1" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.277837 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-jb5zr" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" containerID="cri-o://ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" gracePeriod=10 Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.278166 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.281617 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.281807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerDied","Data":"730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.281866 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.285692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerDied","Data":"35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.285730 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.285796 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.289048 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerStarted","Data":"dde41b90a0a015b85da366c64b34cece8929f7524e251d4490d3d25207b86cbc"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290681 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290710 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290720 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290729 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.326849 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-j55xs" podStartSLOduration=1.717706781 podStartE2EDuration="9.326823975s" podCreationTimestamp="2026-03-20 13:48:58 +0000 UTC" firstStartedPulling="2026-03-20 13:48:59.326190413 +0000 UTC m=+1118.924122942" lastFinishedPulling="2026-03-20 13:49:06.935307607 +0000 UTC m=+1126.533240136" observedRunningTime="2026-03-20 
13:49:07.316191857 +0000 UTC m=+1126.914124406" watchObservedRunningTime="2026-03-20 13:49:07.326823975 +0000 UTC m=+1126.924756514" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.783590 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.798035 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.800061 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.901835 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.901996 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.902046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.902147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" 
(UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.902178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.910978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr" (OuterVolumeSpecName: "kube-api-access-l4lhr") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "kube-api-access-l4lhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.943575 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.947968 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.953013 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config" (OuterVolumeSpecName: "config") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.956985 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004138 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004169 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004179 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004190 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc 
kubenswrapper[4755]: I0320 13:49:08.004198 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298324 4755 generic.go:334] "Generic (PLEG): container finished" podID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" exitCode=0 Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298414 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298416 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerDied","Data":"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24"} Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerDied","Data":"011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb"} Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298509 4755 scope.go:117] "RemoveContainer" containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.318968 4755 scope.go:117] "RemoveContainer" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.359571 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.366084 4755 scope.go:117] "RemoveContainer" 
containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.366714 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:49:08 crc kubenswrapper[4755]: E0320 13:49:08.367008 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24\": container with ID starting with ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24 not found: ID does not exist" containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.367066 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24"} err="failed to get container status \"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24\": rpc error: code = NotFound desc = could not find container \"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24\": container with ID starting with ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24 not found: ID does not exist" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.367104 4755 scope.go:117] "RemoveContainer" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" Mar 20 13:49:08 crc kubenswrapper[4755]: E0320 13:49:08.367587 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367\": container with ID starting with d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367 not found: ID does not exist" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 
13:49:08.367633 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367"} err="failed to get container status \"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367\": rpc error: code = NotFound desc = could not find container \"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367\": container with ID starting with d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367 not found: ID does not exist" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.722296 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.237757 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" path="/var/lib/kubelet/pods/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa/volumes" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.239104 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" path="/var/lib/kubelet/pods/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2/volumes" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.970947 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971251 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971263 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971278 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc 
kubenswrapper[4755]: I0320 13:49:09.971285 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971301 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerName="mariadb-database-create" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971309 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerName="mariadb-database-create" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971318 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971324 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971345 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="init" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971352 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="init" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971496 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971507 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971515 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" 
containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971526 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerName="mariadb-database-create" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.972047 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.975973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f4frh" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.985807 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.989214 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.143793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.143915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.143962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod 
\"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.144011 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245040 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245138 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " 
pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.250170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.250338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.263290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.268836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.314468 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.861642 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.332143 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerStarted","Data":"035267f344c2c393db92cbe47eba53034472523719d5e4bdaa97232f00452499"} Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.472478 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.473958 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.478606 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.483572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.581248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.581323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"root-account-create-update-qhmnv\" (UID: 
\"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.682757 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.682834 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.683702 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.710813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.805686 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:12 crc kubenswrapper[4755]: I0320 13:49:12.960811 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.352342 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerStarted","Data":"cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd"} Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.352385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerStarted","Data":"1d91ca4169d746b344dbc1870b88592b3cfa51c6e71d814610bb8c37bcecadff"} Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.355558 4755 generic.go:334] "Generic (PLEG): container finished" podID="c2ca344f-8f18-4dd9-9e5c-44669ff2da4f" containerID="8e1766906106b58ad71f899855dae6854781edd4e85af35469d4a6541e6db08d" exitCode=0 Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.355622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerDied","Data":"8e1766906106b58ad71f899855dae6854781edd4e85af35469d4a6541e6db08d"} Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.920038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.927294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.970622 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.368633 4755 generic.go:334] "Generic (PLEG): container finished" podID="6d21386c-8267-4dba-9028-d5cb729ff78b" containerID="d6f7605f4c42bfaff2a7ad01f9513a1a2895247ba09bed2e9e8f4f0b129f847f" exitCode=0 Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.368735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerDied","Data":"d6f7605f4c42bfaff2a7ad01f9513a1a2895247ba09bed2e9e8f4f0b129f847f"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.371114 4755 generic.go:334] "Generic (PLEG): container finished" podID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerID="dde41b90a0a015b85da366c64b34cece8929f7524e251d4490d3d25207b86cbc" exitCode=0 Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.371164 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerDied","Data":"dde41b90a0a015b85da366c64b34cece8929f7524e251d4490d3d25207b86cbc"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.373815 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerStarted","Data":"97629bf77590f093931e8bac34f2e3412a85b42a926ed512dbd17949a8e3cb6d"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.374311 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 
13:49:14.383883 4755 generic.go:334] "Generic (PLEG): container finished" podID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" containerID="cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd" exitCode=0 Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.383932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerDied","Data":"cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.467733 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.945003195 podStartE2EDuration="1m5.467714821s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.5612671 +0000 UTC m=+1085.159199629" lastFinishedPulling="2026-03-20 13:48:40.083978726 +0000 UTC m=+1099.681911255" observedRunningTime="2026-03-20 13:49:14.46424564 +0000 UTC m=+1134.062178169" watchObservedRunningTime="2026-03-20 13:49:14.467714821 +0000 UTC m=+1134.065647350" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.566570 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.736882 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.827634 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kbcdp" podUID="408d869f-0966-4908-88e5-37cdff345c4a" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:49:14 crc kubenswrapper[4755]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:49:14 crc kubenswrapper[4755]: > Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.848726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.849416 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.850254 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" (UID: "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.851182 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.855824 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2" (OuterVolumeSpecName: "kube-api-access-qszp2") pod "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" (UID: "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba"). InnerVolumeSpecName "kube-api-access-qszp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.857076 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.951583 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.951620 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.097000 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:15 crc kubenswrapper[4755]: E0320 13:49:15.097569 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" containerName="mariadb-account-create-update" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.097585 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" 
containerName="mariadb-account-create-update" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.097842 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" containerName="mariadb-account-create-update" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.098564 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.102029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.118745 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155562 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " 
pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155592 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155621 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.257885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.259121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" 
(UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.259792 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261456 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261868 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261973 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: 
\"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.262080 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.263034 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.263208 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.265504 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.293327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " 
pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.393862 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"803b623182fc886142f88d5170663c1a27140b44d9bda1e4a361ee0d2fd977f2"} Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.399870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerStarted","Data":"9af8c44c5331144d77cbc2a2a9cdec67745083a2a8f79716ac8bf8ff5681077a"} Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.402375 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.405733 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.405735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerDied","Data":"1d91ca4169d746b344dbc1870b88592b3cfa51c6e71d814610bb8c37bcecadff"} Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.405869 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d91ca4169d746b344dbc1870b88592b3cfa51c6e71d814610bb8c37bcecadff" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.417800 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.428497 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.402908463 podStartE2EDuration="1m6.428475913s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.378905262 +0000 UTC m=+1084.976837801" lastFinishedPulling="2026-03-20 13:48:40.404472722 +0000 UTC m=+1100.002405251" observedRunningTime="2026-03-20 13:49:15.423433122 +0000 UTC m=+1135.021365661" watchObservedRunningTime="2026-03-20 13:49:15.428475913 +0000 UTC m=+1135.026408442" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.004939 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.088042 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.088560 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.088585 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 
13:49:16.089710 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090116 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090172 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090221 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090294 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.091721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.094488 4755 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.095336 4755 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.095216 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57" (OuterVolumeSpecName: "kube-api-access-fzc57") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "kube-api-access-fzc57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.097341 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.113477 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.139904 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts" (OuterVolumeSpecName: "scripts") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.146422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197463 4755 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197506 4755 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197517 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197534 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197548 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.420246 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.426582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"25d5a4ff18b5184bc8816465e98810d024fc0341176edca6252e36aef172a9b3"} Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.433902 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.434331 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerDied","Data":"fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5"} Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.434420 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5" Mar 20 13:49:16 crc kubenswrapper[4755]: W0320 13:49:16.446331 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb32031b_f717_4c38_817c_f9c84a6a50e5.slice/crio-701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8 WatchSource:0}: Error finding container 701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8: Status 404 returned error can't find the container with id 701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8 Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.444152 4755 generic.go:334] "Generic (PLEG): container finished" podID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerID="bf3c4c3fe9431051d31c8d3be691fe02ec3059d025e2ec130cb4e7e269504bb9" exitCode=0 Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.444216 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp-config-vp4xt" event={"ID":"eb32031b-f717-4c38-817c-f9c84a6a50e5","Type":"ContainerDied","Data":"bf3c4c3fe9431051d31c8d3be691fe02ec3059d025e2ec130cb4e7e269504bb9"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.444506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp-config-vp4xt" 
event={"ID":"eb32031b-f717-4c38-817c-f9c84a6a50e5","Type":"ContainerStarted","Data":"701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.450046 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"386f0d71882d58d04019ab933e4f5489c1d4439122b5590cef0983dded660199"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.450099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"aa7564708ab365be4808ac1e14360380b4b97d7a2bd7bce6b9eccb5e5ef9588e"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.772969 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.780085 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:19 crc kubenswrapper[4755]: I0320 13:49:19.238460 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" path="/var/lib/kubelet/pods/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba/volumes" Mar 20 13:49:19 crc kubenswrapper[4755]: I0320 13:49:19.832632 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kbcdp" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.790247 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:49:22 crc kubenswrapper[4755]: E0320 13:49:22.790618 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerName="swift-ring-rebalance" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.790637 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerName="swift-ring-rebalance" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.790885 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerName="swift-ring-rebalance" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.791814 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.795048 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.809120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.826736 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.827290 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.929118 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"root-account-create-update-jvtvk\" (UID: 
\"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.929191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.930429 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.953203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:23 crc kubenswrapper[4755]: I0320 13:49:23.123802 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.325972 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364175 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364291 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364316 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364362 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364351 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364384 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364449 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run" (OuterVolumeSpecName: "var-run") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364410 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364574 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365008 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365519 4755 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365560 4755 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365581 4755 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365604 4755 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.366046 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts" (OuterVolumeSpecName: "scripts") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.372169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp" (OuterVolumeSpecName: "kube-api-access-qwgzp") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "kube-api-access-qwgzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.467603 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.468083 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.533981 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp-config-vp4xt" event={"ID":"eb32031b-f717-4c38-817c-f9c84a6a50e5","Type":"ContainerDied","Data":"701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8"} Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.534463 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.534260 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.864363 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.459099 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.467332 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.546148 4755 generic.go:334] "Generic (PLEG): container finished" podID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerID="b24cc29f4a3d45fd8adb655ff3aefc2dd43173d332839123e38cb6e66cc20cc0" exitCode=0 Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.546217 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jvtvk" event={"ID":"8ae45e95-b96a-4157-a584-a6eb321d5091","Type":"ContainerDied","Data":"b24cc29f4a3d45fd8adb655ff3aefc2dd43173d332839123e38cb6e66cc20cc0"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.546250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jvtvk" event={"ID":"8ae45e95-b96a-4157-a584-a6eb321d5091","Type":"ContainerStarted","Data":"73ad3ea704ac6fcf7a56f21287d468f7a1e9a438c5cc324450490f6a0cb493ce"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.552725 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"bb6427f350f857361e734cc6bbf1571e294e4ed9d2f52827d03d397adb9aac6b"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.558853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" 
event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerStarted","Data":"357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.590841 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-w78rr" podStartSLOduration=2.986029478 podStartE2EDuration="17.590816844s" podCreationTimestamp="2026-03-20 13:49:09 +0000 UTC" firstStartedPulling="2026-03-20 13:49:10.871080796 +0000 UTC m=+1130.469013345" lastFinishedPulling="2026-03-20 13:49:25.475868182 +0000 UTC m=+1145.073800711" observedRunningTime="2026-03-20 13:49:26.586016528 +0000 UTC m=+1146.183949057" watchObservedRunningTime="2026-03-20 13:49:26.590816844 +0000 UTC m=+1146.188749373" Mar 20 13:49:27 crc kubenswrapper[4755]: I0320 13:49:27.259241 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" path="/var/lib/kubelet/pods/eb32031b-f717-4c38-817c-f9c84a6a50e5/volumes" Mar 20 13:49:27 crc kubenswrapper[4755]: I0320 13:49:27.570846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"db511d929300fed522174fdba449c70ea7596737d488551f4e2f059cc273bbe3"} Mar 20 13:49:27 crc kubenswrapper[4755]: I0320 13:49:27.900787 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.016463 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"8ae45e95-b96a-4157-a584-a6eb321d5091\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.016838 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"8ae45e95-b96a-4157-a584-a6eb321d5091\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.018027 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ae45e95-b96a-4157-a584-a6eb321d5091" (UID: "8ae45e95-b96a-4157-a584-a6eb321d5091"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.022535 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv" (OuterVolumeSpecName: "kube-api-access-9ncbv") pod "8ae45e95-b96a-4157-a584-a6eb321d5091" (UID: "8ae45e95-b96a-4157-a584-a6eb321d5091"). InnerVolumeSpecName "kube-api-access-9ncbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.118608 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.118923 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.581396 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jvtvk" event={"ID":"8ae45e95-b96a-4157-a584-a6eb321d5091","Type":"ContainerDied","Data":"73ad3ea704ac6fcf7a56f21287d468f7a1e9a438c5cc324450490f6a0cb493ce"} Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.581466 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ad3ea704ac6fcf7a56f21287d468f7a1e9a438c5cc324450490f6a0cb493ce" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.581490 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.587583 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"74b93bc77d9e03ce70262c37ca57793e0e8511f606820a6bf55e586e8e0cd619"} Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.587617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"2a9d8e97b92781ea960da33f5a3c070691cc4640898651227da318e9557b8dd0"} Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.587630 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"4ac3d8248ee41af2077062af84da816bddfd837378ae9df44f304fc24e37aa43"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.616269 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"af20658b5b931396ba82dcef7b91e5c229663e6cf674739e4f96f3c62e654d76"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.617337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"5c0a8a5d65d078c7c6da6d1783e1d0b5e3153a501b388c51d47c6e4cafe3dab3"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.617362 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"fa752aed9c1f5e29f08081bb21ff47476575077b9084bdf5e56a291b588db25f"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.617377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"66d618d73f3d742120b395cdbc6b49c6a661d26e4cb32676067b85d7a8f031ff"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.867871 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.290901 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.647884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"45cd9d5f5e6ca55047b5b4ecba8c9ba341089dc30300d658188fe0fa39e785c3"} Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.647956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"095e77259823fa84683c79c5bd2e9aef61a1906b715026c515fe0b6d5f47296a"} Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.647971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"d0ca7024be80702ea50db5457c27fdf54d60676f49b089e45f727e61586b553c"} Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.725349 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.652563965 podStartE2EDuration="35.725324972s" podCreationTimestamp="2026-03-20 13:48:56 +0000 UTC" firstStartedPulling="2026-03-20 13:49:14.598680932 +0000 UTC m=+1134.196613461" lastFinishedPulling="2026-03-20 13:49:29.671441909 +0000 UTC m=+1149.269374468" observedRunningTime="2026-03-20 13:49:31.71990418 +0000 UTC m=+1151.317836739" 
watchObservedRunningTime="2026-03-20 13:49:31.725324972 +0000 UTC m=+1151.323257501" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.826399 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"] Mar 20 13:49:31 crc kubenswrapper[4755]: E0320 13:49:31.827058 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerName="ovn-config" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827082 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerName="ovn-config" Mar 20 13:49:31 crc kubenswrapper[4755]: E0320 13:49:31.827113 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerName="mariadb-account-create-update" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerName="mariadb-account-create-update" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827383 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerName="mariadb-account-create-update" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827414 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerName="ovn-config" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.828224 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.836174 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.842109 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2jwbt"] Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.843801 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.868564 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"] Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.880749 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2jwbt"] Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.892007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.892099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.984698 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jm9nr"] Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 
13:49:31.988055 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994397 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994474 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994586 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.995631 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.996406 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jm9nr"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.019054 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hm9qz"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.020606 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.053970 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hm9qz"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.064803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.096769 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.096895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wd5\" (UniqueName: 
\"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.096937 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097012 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097085 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.118341 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.121409 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.123368 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.140901 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.154824 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.160283 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.169102 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.169557 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.187723 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.188645 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200303 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200371 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " 
pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200401 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200437 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200518 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200578 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: 
\"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200615 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.201894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.202391 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.226122 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.241446 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.253635 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wd5\" (UniqueName: 
\"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.300748 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9xrbx"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.301959 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302023 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302081 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc 
kubenswrapper[4755]: I0320 13:49:32.302177 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302227 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302263 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.303419 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 
13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.303489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.304198 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.305781 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.305929 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.306605 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.310163 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.310770 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.311346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.312801 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.318005 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9xrbx"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.319887 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.330599 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.332110 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.343901 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.346303 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.348315 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.363850 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.404066 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405351 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405445 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99pj7\" (UniqueName: \"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405826 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.407394 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.465847 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.488106 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.506889 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.506933 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99pj7\" (UniqueName: \"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.506960 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.507004 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.507064 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"keystone-db-sync-9xrbx\" (UID: 
\"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.509612 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.514434 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.518248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.518591 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.531947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.535187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99pj7\" (UniqueName: 
\"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.616258 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2jwbt"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.661824 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.677905 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.706897 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2jwbt" event={"ID":"8c5d05dc-a589-4d2e-9374-0d57202a3cfc","Type":"ContainerStarted","Data":"ea2b778d5d0a296e85801408983c6ae18b5c4259e0281b07ac67d0e2cc8163c9"} Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.755423 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.971971 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jm9nr"] Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.999249 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hm9qz"] Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.009965 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"] Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.103521 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.454310 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"] Mar 20 13:49:33 crc kubenswrapper[4755]: W0320 13:49:33.468867 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34c85756_25cf_4302_bd5d_72f2e459f562.slice/crio-b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00 WatchSource:0}: Error finding container b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00: Status 404 returned error can't find the container with id b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.472319 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9xrbx"] Mar 20 13:49:33 crc kubenswrapper[4755]: W0320 13:49:33.479232 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ad8e64_0606_4171_bd2d_ae8212fdff8f.slice/crio-ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21 WatchSource:0}: Error finding container ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21: Status 404 returned error can't find the container with id ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.716562 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerStarted","Data":"ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.719165 4755 generic.go:334] "Generic (PLEG): container finished" podID="e38d31ac-eae6-4cd1-be04-304215db852a" containerID="c657148ccc1a27d9b62255884d6a6e1d1019e179c3fce4621605696f07b5b3a8" exitCode=0 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.719284 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hm9qz" event={"ID":"e38d31ac-eae6-4cd1-be04-304215db852a","Type":"ContainerDied","Data":"c657148ccc1a27d9b62255884d6a6e1d1019e179c3fce4621605696f07b5b3a8"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.719326 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hm9qz" event={"ID":"e38d31ac-eae6-4cd1-be04-304215db852a","Type":"ContainerStarted","Data":"842094ddef6d47f893f153b311a559660a91386da7c1ac1005b8b0a242455029"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.726215 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc04-account-create-update-x9t57" event={"ID":"34c85756-25cf-4302-bd5d-72f2e459f562","Type":"ContainerStarted","Data":"b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.727493 4755 generic.go:334] "Generic (PLEG): container finished" podID="feb55e83-711d-4561-8b57-2a231944e1b1" containerID="705f6219cf6e7229f8b2ed7393ea0a90aeac31b526f89efc1dd2e1e93d320b12" exitCode=0 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.727542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jm9nr" event={"ID":"feb55e83-711d-4561-8b57-2a231944e1b1","Type":"ContainerDied","Data":"705f6219cf6e7229f8b2ed7393ea0a90aeac31b526f89efc1dd2e1e93d320b12"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.727560 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jm9nr" event={"ID":"feb55e83-711d-4561-8b57-2a231944e1b1","Type":"ContainerStarted","Data":"bc3ad24f7ade3ad796eff6e50df9588992f2a7cd48d88e2aeaa5ee71e47caecd"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.729037 4755 generic.go:334] "Generic (PLEG): container finished" podID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerID="035f6fa288ba835c95b145a119d04cf41e9e3a54cd012475c7a081a2276a5557" exitCode=0 Mar 20 
13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.729089 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9528-account-create-update-6xkmx" event={"ID":"5dde547e-5fce-4868-ba0e-63650ea0c771","Type":"ContainerDied","Data":"035f6fa288ba835c95b145a119d04cf41e9e3a54cd012475c7a081a2276a5557"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.729104 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9528-account-create-update-6xkmx" event={"ID":"5dde547e-5fce-4868-ba0e-63650ea0c771","Type":"ContainerStarted","Data":"7b99045593a924f15277584d13d0931a3b67834972c9adc02f24a8927b46e3c3"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.730129 4755 generic.go:334] "Generic (PLEG): container finished" podID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerID="02c033d98a31eff9b6f2fd27a65dcce2cdba9ee50e31a547659840069ed55645" exitCode=0 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.730184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2jwbt" event={"ID":"8c5d05dc-a589-4d2e-9374-0d57202a3cfc","Type":"ContainerDied","Data":"02c033d98a31eff9b6f2fd27a65dcce2cdba9ee50e31a547659840069ed55645"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.731388 4755 generic.go:334] "Generic (PLEG): container finished" podID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerID="60f5595fcede6ec841b414dc41e27b9bf107d18aaf78a0ca6302cf7b01dc28b2" exitCode=0 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.731470 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1376-account-create-update-jhbhp" event={"ID":"015c8ae7-1856-4b0c-b5ce-e2503a2080dc","Type":"ContainerDied","Data":"60f5595fcede6ec841b414dc41e27b9bf107d18aaf78a0ca6302cf7b01dc28b2"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.731499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1376-account-create-update-jhbhp" 
event={"ID":"015c8ae7-1856-4b0c-b5ce-e2503a2080dc","Type":"ContainerStarted","Data":"040880eed8633e76726b57ccd8d767c3e70fc0a48b5aadc1240fac1c3e68983f"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.732561 4755 generic.go:334] "Generic (PLEG): container finished" podID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" exitCode=0 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.732604 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerDied","Data":"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.732617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerStarted","Data":"61e599f2b06be42605f3f5420cbd417da830635ec8940109cba58f789bbd856c"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.743818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerStarted","Data":"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.745741 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.747862 4755 generic.go:334] "Generic (PLEG): container finished" podID="34c85756-25cf-4302-bd5d-72f2e459f562" containerID="cea560be39cccd516b77d0d30da3bc9d64db06b7455423a8e22eacb2c87d57e2" exitCode=0 Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.748008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc04-account-create-update-x9t57" 
event={"ID":"34c85756-25cf-4302-bd5d-72f2e459f562","Type":"ContainerDied","Data":"cea560be39cccd516b77d0d30da3bc9d64db06b7455423a8e22eacb2c87d57e2"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.751484 4755 generic.go:334] "Generic (PLEG): container finished" podID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerID="357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56" exitCode=0 Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.751535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerDied","Data":"357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.771187 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" podStartSLOduration=2.771168695 podStartE2EDuration="2.771168695s" podCreationTimestamp="2026-03-20 13:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:34.764032317 +0000 UTC m=+1154.361964846" watchObservedRunningTime="2026-03-20 13:49:34.771168695 +0000 UTC m=+1154.369101224" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.144541 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.210425 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.210587 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.214586 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c5d05dc-a589-4d2e-9374-0d57202a3cfc" (UID: "8c5d05dc-a589-4d2e-9374-0d57202a3cfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.218562 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7" (OuterVolumeSpecName: "kube-api-access-n5xn7") pod "8c5d05dc-a589-4d2e-9374-0d57202a3cfc" (UID: "8c5d05dc-a589-4d2e-9374-0d57202a3cfc"). InnerVolumeSpecName "kube-api-access-n5xn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.285856 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.291338 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.302448 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.309833 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.314349 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.314395 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416019 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"feb55e83-711d-4561-8b57-2a231944e1b1\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416435 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"e38d31ac-eae6-4cd1-be04-304215db852a\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416517 
4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"e38d31ac-eae6-4cd1-be04-304215db852a\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416549 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"5dde547e-5fce-4868-ba0e-63650ea0c771\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416626 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"5dde547e-5fce-4868-ba0e-63650ea0c771\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416693 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"feb55e83-711d-4561-8b57-2a231944e1b1\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416860 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlq6p\" (UniqueName: 
\"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417441 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e38d31ac-eae6-4cd1-be04-304215db852a" (UID: "e38d31ac-eae6-4cd1-be04-304215db852a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417576 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5dde547e-5fce-4868-ba0e-63650ea0c771" (UID: "5dde547e-5fce-4868-ba0e-63650ea0c771"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417620 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "feb55e83-711d-4561-8b57-2a231944e1b1" (UID: "feb55e83-711d-4561-8b57-2a231944e1b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417586 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "015c8ae7-1856-4b0c-b5ce-e2503a2080dc" (UID: "015c8ae7-1856-4b0c-b5ce-e2503a2080dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.421447 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk" (OuterVolumeSpecName: "kube-api-access-frkpk") pod "e38d31ac-eae6-4cd1-be04-304215db852a" (UID: "e38d31ac-eae6-4cd1-be04-304215db852a"). InnerVolumeSpecName "kube-api-access-frkpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.422328 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5" (OuterVolumeSpecName: "kube-api-access-82wd5") pod "feb55e83-711d-4561-8b57-2a231944e1b1" (UID: "feb55e83-711d-4561-8b57-2a231944e1b1"). InnerVolumeSpecName "kube-api-access-82wd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.423759 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p" (OuterVolumeSpecName: "kube-api-access-wlq6p") pod "015c8ae7-1856-4b0c-b5ce-e2503a2080dc" (UID: "015c8ae7-1856-4b0c-b5ce-e2503a2080dc"). InnerVolumeSpecName "kube-api-access-wlq6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.423790 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7" (OuterVolumeSpecName: "kube-api-access-98tb7") pod "5dde547e-5fce-4868-ba0e-63650ea0c771" (UID: "5dde547e-5fce-4868-ba0e-63650ea0c771"). InnerVolumeSpecName "kube-api-access-98tb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520113 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520157 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520170 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520179 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520189 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520197 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520206 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520219 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.766934 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.766917 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jm9nr" event={"ID":"feb55e83-711d-4561-8b57-2a231944e1b1","Type":"ContainerDied","Data":"bc3ad24f7ade3ad796eff6e50df9588992f2a7cd48d88e2aeaa5ee71e47caecd"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.767116 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc3ad24f7ade3ad796eff6e50df9588992f2a7cd48d88e2aeaa5ee71e47caecd" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.770357 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.770353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9528-account-create-update-6xkmx" event={"ID":"5dde547e-5fce-4868-ba0e-63650ea0c771","Type":"ContainerDied","Data":"7b99045593a924f15277584d13d0931a3b67834972c9adc02f24a8927b46e3c3"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.770638 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b99045593a924f15277584d13d0931a3b67834972c9adc02f24a8927b46e3c3" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.772519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2jwbt" event={"ID":"8c5d05dc-a589-4d2e-9374-0d57202a3cfc","Type":"ContainerDied","Data":"ea2b778d5d0a296e85801408983c6ae18b5c4259e0281b07ac67d0e2cc8163c9"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.772560 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea2b778d5d0a296e85801408983c6ae18b5c4259e0281b07ac67d0e2cc8163c9" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.772679 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.774377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1376-account-create-update-jhbhp" event={"ID":"015c8ae7-1856-4b0c-b5ce-e2503a2080dc","Type":"ContainerDied","Data":"040880eed8633e76726b57ccd8d767c3e70fc0a48b5aadc1240fac1c3e68983f"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.774402 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="040880eed8633e76726b57ccd8d767c3e70fc0a48b5aadc1240fac1c3e68983f" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.774402 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.776123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hm9qz" event={"ID":"e38d31ac-eae6-4cd1-be04-304215db852a","Type":"ContainerDied","Data":"842094ddef6d47f893f153b311a559660a91386da7c1ac1005b8b0a242455029"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.776187 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="842094ddef6d47f893f153b311a559660a91386da7c1ac1005b8b0a242455029" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.776293 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.751621 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.752322 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.752422 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.754435 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.754733 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773" gracePeriod=600 Mar 20 13:49:37 crc kubenswrapper[4755]: I0320 13:49:37.840206 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773" exitCode=0 Mar 20 13:49:37 crc kubenswrapper[4755]: I0320 13:49:37.840271 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773"} Mar 20 13:49:37 crc kubenswrapper[4755]: I0320 13:49:37.840321 4755 scope.go:117] "RemoveContainer" containerID="d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.584013 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.591146 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735068 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735153 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99pj7\" (UniqueName: \"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"34c85756-25cf-4302-bd5d-72f2e459f562\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735440 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735489 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"34c85756-25cf-4302-bd5d-72f2e459f562\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735544 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.736923 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34c85756-25cf-4302-bd5d-72f2e459f562" (UID: "34c85756-25cf-4302-bd5d-72f2e459f562"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.743360 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2" (OuterVolumeSpecName: "kube-api-access-sxsh2") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "kube-api-access-sxsh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.743368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.744283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7" (OuterVolumeSpecName: "kube-api-access-99pj7") pod "34c85756-25cf-4302-bd5d-72f2e459f562" (UID: "34c85756-25cf-4302-bd5d-72f2e459f562"). InnerVolumeSpecName "kube-api-access-99pj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.773045 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.794977 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data" (OuterVolumeSpecName: "config-data") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837472 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837501 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837511 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837520 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99pj7\" (UniqueName: 
\"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837529 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837538 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.854480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc04-account-create-update-x9t57" event={"ID":"34c85756-25cf-4302-bd5d-72f2e459f562","Type":"ContainerDied","Data":"b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00"} Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.854535 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.854618 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.862789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerDied","Data":"035267f344c2c393db92cbe47eba53034472523719d5e4bdaa97232f00452499"} Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.862849 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035267f344c2c393db92cbe47eba53034472523719d5e4bdaa97232f00452499" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.862917 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:39 crc kubenswrapper[4755]: I0320 13:49:39.878753 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74"} Mar 20 13:49:39 crc kubenswrapper[4755]: I0320 13:49:39.881956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerStarted","Data":"291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6"} Mar 20 13:49:39 crc kubenswrapper[4755]: I0320 13:49:39.939592 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9xrbx" podStartSLOduration=2.833014065 podStartE2EDuration="7.939564659s" podCreationTimestamp="2026-03-20 13:49:32 +0000 UTC" firstStartedPulling="2026-03-20 13:49:33.490335216 +0000 UTC m=+1153.088267745" lastFinishedPulling="2026-03-20 13:49:38.59688579 +0000 UTC m=+1158.194818339" observedRunningTime="2026-03-20 13:49:39.931328493 +0000 UTC m=+1159.529261042" 
watchObservedRunningTime="2026-03-20 13:49:39.939564659 +0000 UTC m=+1159.537497198" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.170704 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.171091 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns" containerID="cri-o://ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" gracePeriod=10 Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.182067 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.240750 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241859 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241878 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241894 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241900 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241924 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" 
containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241931 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241940 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241947 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241958 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241964 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241994 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38d31ac-eae6-4cd1-be04-304215db852a" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242004 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38d31ac-eae6-4cd1-be04-304215db852a" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.242018 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerName="glance-db-sync" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242025 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerName="glance-db-sync" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242230 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e38d31ac-eae6-4cd1-be04-304215db852a" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242243 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242262 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242271 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242290 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242307 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242317 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerName="glance-db-sync" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.243607 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.253287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.369689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.369750 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.369948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.370077 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.370419 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.370486 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.473872 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.473945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474044 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474090 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474113 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.477348 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.478479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.478486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.478603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.479303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.502378 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.628938 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.750062 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.781793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782085 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782261 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782328 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.800945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg" (OuterVolumeSpecName: "kube-api-access-jq7rg") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "kube-api-access-jq7rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.829039 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.848424 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.853458 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.868553 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.883980 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config" (OuterVolumeSpecName: "config") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884272 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884298 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884315 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884327 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc 
kubenswrapper[4755]: I0320 13:49:40.884337 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884350 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.899359 4755 generic.go:334] "Generic (PLEG): container finished" podID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" exitCode=0 Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.900029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerDied","Data":"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11"} Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.906024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerDied","Data":"61e599f2b06be42605f3f5420cbd417da830635ec8940109cba58f789bbd856c"} Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.906773 4755 scope.go:117] "RemoveContainer" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.907370 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.977796 4755 scope.go:117] "RemoveContainer" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.003931 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.019018 4755 scope.go:117] "RemoveContainer" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" Mar 20 13:49:41 crc kubenswrapper[4755]: E0320 13:49:41.020604 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11\": container with ID starting with ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11 not found: ID does not exist" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.020646 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11"} err="failed to get container status \"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11\": rpc error: code = NotFound desc = could not find container \"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11\": container with ID starting with ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11 not found: ID does not exist" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.020685 4755 scope.go:117] "RemoveContainer" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" Mar 20 13:49:41 crc kubenswrapper[4755]: E0320 13:49:41.021598 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788\": container with ID starting with 27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788 not found: ID does not exist" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.021697 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788"} err="failed to get container status \"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788\": rpc error: code = NotFound desc = could not find container \"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788\": container with ID starting with 27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788 not found: ID does not exist" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.021830 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.142693 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.239932 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" path="/var/lib/kubelet/pods/87a11166-3f5f-4f57-a8ba-19f88c636ee7/volumes" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.913675 4755 generic.go:334] "Generic (PLEG): container finished" podID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerID="e01d91a7e23ddb2ed11973d411a56ca34359b15e81493c7a678686b1bd95c9dc" exitCode=0 Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.913802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" 
event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerDied","Data":"e01d91a7e23ddb2ed11973d411a56ca34359b15e81493c7a678686b1bd95c9dc"} Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.914223 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerStarted","Data":"35342b414d35d91006e710a0dacee45f33a514dab176d4298d187fb90fe3be69"} Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.926507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerStarted","Data":"2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9"} Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.926820 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.928999 4755 generic.go:334] "Generic (PLEG): container finished" podID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerID="291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6" exitCode=0 Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.929053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerDied","Data":"291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6"} Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.958046 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" podStartSLOduration=2.958025964 podStartE2EDuration="2.958025964s" podCreationTimestamp="2026-03-20 13:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:42.951847632 +0000 UTC 
m=+1162.549780202" watchObservedRunningTime="2026-03-20 13:49:42.958025964 +0000 UTC m=+1162.555958483" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.383244 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.409068 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.409376 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.409765 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.444969 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7" (OuterVolumeSpecName: "kube-api-access-2k6t7") pod "64ad8e64-0606-4171-bd2d-ae8212fdff8f" (UID: "64ad8e64-0606-4171-bd2d-ae8212fdff8f"). InnerVolumeSpecName "kube-api-access-2k6t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.472996 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ad8e64-0606-4171-bd2d-ae8212fdff8f" (UID: "64ad8e64-0606-4171-bd2d-ae8212fdff8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.509352 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data" (OuterVolumeSpecName: "config-data") pod "64ad8e64-0606-4171-bd2d-ae8212fdff8f" (UID: "64ad8e64-0606-4171-bd2d-ae8212fdff8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.520172 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.520227 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.520245 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.950471 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" 
event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerDied","Data":"ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21"}
Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.950520 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21"
Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.950599 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.285850 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.286827 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" containerID="cri-o://2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9" gracePeriod=10
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.333243 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7d8ml"]
Mar 20 13:49:45 crc kubenswrapper[4755]: E0320 13:49:45.334160 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334292 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns"
Mar 20 13:49:45 crc kubenswrapper[4755]: E0320 13:49:45.334367 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="init"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334416 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="init"
Mar 20 13:49:45 crc kubenswrapper[4755]: E0320 13:49:45.334491 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerName="keystone-db-sync"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334552 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerName="keystone-db-sync"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334816 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334897 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerName="keystone-db-sync"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.335743 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.339325 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.339686 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.339819 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.341097 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.343317 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.347481 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.349751 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.367869 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.374476 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441735 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441951 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441971 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441989 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.442010 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.442046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544225 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544241 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544283 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544362 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544384 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544467 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.548984 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.558057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.558207 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.558483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.559695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.560142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.560430 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.561724 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.570558 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.573897 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.575576 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.582018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.588839 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.589294 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.589770 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8gg2p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.590453 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.597058 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-52m67"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.598129 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.603303 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.603379 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.603541 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4kqwk"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.616484 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.629737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.641361 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645549 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645594 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645719 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645773 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.658061 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.679794 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.684932 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-52m67"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.721428 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cxr9p"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.727498 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.732635 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ncc5q"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.732851 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.732864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751741 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751821 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751844 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751864 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751963 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752047 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752065 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752088 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752495 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.754338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.755239 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.759936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.774040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.774251 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.800848 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jrf8c"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.813672 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.820206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.823092 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.827499 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dtggj"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.828752 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.829441 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.829635 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t52g6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.830070 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.831850 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nvndn"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.834615 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862459 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862486 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864191 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jrf8c"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864465 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864599 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864705 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864725 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864913 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.867498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.871623 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.871733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.892093 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " 
pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.905371 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.905453 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.947246 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dtggj"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.963228 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986131 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986411 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"barbican-db-sync-dtggj\" (UID: 
\"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986735 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986907 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.987001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 
13:49:45.987093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.002062 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.003296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.006493 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.009279 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.010153 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.010822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.014169 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.015779 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.019306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.032822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.033894 4755 generic.go:334] "Generic (PLEG): container finished" podID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerID="2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9" exitCode=0 Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.034113 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerDied","Data":"2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9"} Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.054379 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.056475 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.056548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.075880 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.077496 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.089827 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090129 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090199 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090221 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: 
\"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090383 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090408 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: 
\"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090425 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.098185 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.114500 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.160979 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.183190 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.183241 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.183354 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.186481 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.187299 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.187563 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.187602 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.191326 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.191747 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.199477 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207392 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207606 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207800 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207922 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207968 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208198 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208303 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208360 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.209524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.209725 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: 
\"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.210034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.210119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.210759 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f4frh" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.212831 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.212910 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.215882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.216350 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.218058 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.222479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.228676 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dtggj" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.257535 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.261520 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311365 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311844 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311919 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311945 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311985 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312011 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312036 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc 
kubenswrapper[4755]: I0320 13:49:46.312058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312100 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312164 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312202 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312252 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414383 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414545 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414646 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414682 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414721 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414741 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414762 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414788 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.417773 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.421182 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.424197 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.425958 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.430496 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.431971 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.433253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.434897 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.435895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.435943 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.437041 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.439333 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.439388 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.452957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.454353 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.482160 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.494266 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.518744 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.520919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.544508 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.617888 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.625443 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721743 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721863 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721909 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721984 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: 
\"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.722150 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.728278 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p" (OuterVolumeSpecName: "kube-api-access-69w7p") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "kube-api-access-69w7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.813956 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: E0320 13:49:46.814580 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="init" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.814598 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="init" Mar 20 13:49:46 crc kubenswrapper[4755]: E0320 13:49:46.814635 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.814644 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.814924 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" Mar 20 13:49:46 crc 
kubenswrapper[4755]: I0320 13:49:46.827316 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.831514 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.833274 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.834797 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.836294 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.841196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.844972 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.859489 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config" (OuterVolumeSpecName: "config") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.866375 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.879280 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.914126 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935069 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935118 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935490 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935516 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935532 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935546 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935555 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.936683 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.947430 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 13:49:46 crc kubenswrapper[4755]: W0320 13:49:46.951602 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e3b9192_0e1c_4c85_82de_3a54a4272c48.slice/crio-bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5 WatchSource:0}: Error finding container bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5: Status 404 returned error can't find the container with id bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5 Mar 20 13:49:46 crc kubenswrapper[4755]: W0320 13:49:46.955267 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69707be4_e338_4e13_8ecc_8cfd7cd416b2.slice/crio-312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997 WatchSource:0}: Error finding container 312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997: Status 404 returned error can't find the container with id 
312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997 Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042671 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042717 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042768 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042789 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042837 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc 
kubenswrapper[4755]: I0320 13:49:47.042862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042878 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.043369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.043410 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.045991 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.050114 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.051378 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.056878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.057417 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.075233 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.076053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerDied","Data":"35342b414d35d91006e710a0dacee45f33a514dab176d4298d187fb90fe3be69"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.076134 4755 scope.go:117] "RemoveContainer" containerID="2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.078990 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.094439 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerStarted","Data":"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.095000 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerStarted","Data":"187f0aeadf4e124f27492aa0d9af1cc05ae52352a5bbdb94c65aff34e13e285c"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.101792 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c6c7fc7-qs6b6" event={"ID":"7e3b9192-0e1c-4c85-82de-3a54a4272c48","Type":"ContainerStarted","Data":"bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.103040 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerStarted","Data":"7a344e33e76b6340b3d36c5febb65925d0d2e9247a3cde79674c8ed947f90df6"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.123726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerStarted","Data":"312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.124033 4755 scope.go:117] "RemoveContainer" containerID="e01d91a7e23ddb2ed11973d411a56ca34359b15e81493c7a678686b1bd95c9dc" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.147023 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: W0320 13:49:47.163384 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ea35a84_68ca_4490_b1d9_fa999ef63ebe.slice/crio-7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662 WatchSource:0}: Error finding container 7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662: Status 404 returned error can't find the container with id 7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662 Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.169747 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.175542 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.212211 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.254972 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.343043 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jrf8c"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.373149 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dtggj"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.429770 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:49:47 crc kubenswrapper[4755]: W0320 13:49:47.528931 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e WatchSource:0}: Error finding container aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e: Status 404 returned error can't find the container with id aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.530288 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.578749 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.814433 4755 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862458 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862539 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862791 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.881823 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6" (OuterVolumeSpecName: "kube-api-access-gn7v6") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "kube-api-access-gn7v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.899784 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.914444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.953462 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config" (OuterVolumeSpecName: "config") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.955287 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966225 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966578 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966617 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966631 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966642 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966773 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966785 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.993005 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.086138 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: W0320 13:49:48.140361 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d02dff6_d832_40b7_8291_f7f08be96659.slice/crio-c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c WatchSource:0}: Error finding container c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c: Status 404 returned error can't find the container with id c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.147374 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerStarted","Data":"7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.149813 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerStarted","Data":"fa39cf0d73d745c483cbf46583864f4385b896f58bc8b0da2a658f1a87cd2c55"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153109 4755 generic.go:334] "Generic (PLEG): container finished" podID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" exitCode=0 Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153159 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerDied","Data":"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153179 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerDied","Data":"187f0aeadf4e124f27492aa0d9af1cc05ae52352a5bbdb94c65aff34e13e285c"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153197 4755 scope.go:117] "RemoveContainer" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153267 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.163817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerStarted","Data":"bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.171349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.187500 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerStarted","Data":"39b250f5e1e7bb646a21c55743a8b2114ac7daab0c0bbdbf6157f556d4805a70"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.193476 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerStarted","Data":"46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.199861 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7d8ml" podStartSLOduration=3.199840168 podStartE2EDuration="3.199840168s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:48.185504537 +0000 UTC m=+1167.783437136" watchObservedRunningTime="2026-03-20 13:49:48.199840168 +0000 UTC m=+1167.797772697" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.210262 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-jrf8c" event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerStarted","Data":"88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.233128 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.244544 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.245385 4755 scope.go:117] "RemoveContainer" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" Mar 20 13:49:48 crc kubenswrapper[4755]: E0320 13:49:48.247138 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea\": container with ID starting with 725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea not found: ID does not exist" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.247190 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea"} err="failed to get container status \"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea\": rpc error: code = NotFound desc = could not find container \"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea\": container with ID starting with 725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea not found: ID does not exist" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.318971 4755 generic.go:334] "Generic (PLEG): container finished" podID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerID="e30f36458d80c9adebf91aebb787303d5a4c021136e6f5c28c1778fb3d808295" 
exitCode=0 Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.319063 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerDied","Data":"e30f36458d80c9adebf91aebb787303d5a4c021136e6f5c28c1778fb3d808295"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.319107 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerStarted","Data":"f4cea975e04082628cd5787018d084f226541b1327d170cee0f4b957229de5d6"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.347988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerStarted","Data":"935676461a0f68fb05e2cbff2d17aad3ec596d2702ca73c0eb704a8f7b9a97bb"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.362251 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-52m67" podStartSLOduration=3.362231364 podStartE2EDuration="3.362231364s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:48.280424664 +0000 UTC m=+1167.878357193" watchObservedRunningTime="2026-03-20 13:49:48.362231364 +0000 UTC m=+1167.960163893" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.781312 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.957458 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.974376 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.070323 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.120330 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:49 crc kubenswrapper[4755]: E0320 13:49:49.120737 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerName="init" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.120753 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerName="init" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.120923 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerName="init" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.121826 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.149458 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.247320 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" path="/var/lib/kubelet/pods/1111c2ae-21ad-47b4-9ec0-e51d507a864e/volumes" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.248139 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" path="/var/lib/kubelet/pods/4132d383-5c0f-4d4f-9622-c0e5c41d6568/volumes" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304185 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304396 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304441 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.381148 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerStarted","Data":"70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f"} Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.381235 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.386455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerStarted","Data":"c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c"} Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.409346 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" podStartSLOduration=4.409326772 podStartE2EDuration="4.409326772s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:49.405623336 +0000 UTC m=+1169.003555865" 
watchObservedRunningTime="2026-03-20 13:49:49.409326772 +0000 UTC m=+1169.007259301" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.416689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.417125 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.417331 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.418108 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.418184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc 
kubenswrapper[4755]: I0320 13:49:49.419732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.421011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.421173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.427357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.441162 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.458196 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.137818 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:50 crc kubenswrapper[4755]: W0320 13:49:50.161837 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba56df0_ceeb_40c0_b1b0_15bb4d548b80.slice/crio-c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827 WatchSource:0}: Error finding container c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827: Status 404 returned error can't find the container with id c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827 Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.416029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerStarted","Data":"6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8"} Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.429236 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerStarted","Data":"64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c"} Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.432074 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerStarted","Data":"c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827"} Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.446003 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerStarted","Data":"d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc"} Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.446119 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" containerID="cri-o://64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.446180 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" containerID="cri-o://d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.449004 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerStarted","Data":"54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb"} Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.449349 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" containerID="cri-o://6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.449366 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" containerID="cri-o://54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.485917 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.485879735 podStartE2EDuration="6.485879735s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:51.478273209 +0000 UTC m=+1171.076205738" watchObservedRunningTime="2026-03-20 13:49:51.485879735 +0000 UTC m=+1171.083812264" Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.515332 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.515313643 podStartE2EDuration="6.515313643s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:51.503580931 +0000 UTC m=+1171.101513460" watchObservedRunningTime="2026-03-20 13:49:51.515313643 +0000 UTC m=+1171.113246172" Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.480178 4755 generic.go:334] "Generic (PLEG): container finished" podID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerID="bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae" exitCode=0 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.480262 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerDied","Data":"bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486123 4755 generic.go:334] "Generic (PLEG): container finished" podID="1d02dff6-d832-40b7-8291-f7f08be96659" containerID="54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb" exitCode=0 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486160 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="1d02dff6-d832-40b7-8291-f7f08be96659" containerID="6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8" exitCode=143 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486213 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerDied","Data":"54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486249 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerDied","Data":"6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499095 4755 generic.go:334] "Generic (PLEG): container finished" podID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerID="d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc" exitCode=0 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499156 4755 generic.go:334] "Generic (PLEG): container finished" podID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerID="64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c" exitCode=143 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499258 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerDied","Data":"d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerDied","Data":"64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c"} Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.342721 4755 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.402126 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.404818 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.408738 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.447243 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480446 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc 
kubenswrapper[4755]: I0320 13:49:54.480512 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480598 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480941 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.527953 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.543771 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9f7d4c74d-t7tpq"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.546197 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.557052 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9f7d4c74d-t7tpq"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583809 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af5836e-8c76-4432-95c0-ef34d6fc3528-logs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583891 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-scripts\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-tls-certs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-combined-ca-bundle\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583991 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584091 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-secret-key\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584146 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-525ds\" (UniqueName: 
\"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584165 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-config-data\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584188 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xxh\" (UniqueName: \"kubernetes.io/projected/2af5836e-8c76-4432-95c0-ef34d6fc3528-kube-api-access-88xxh\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.587802 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod 
\"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.591096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.593346 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.610299 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.617573 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.622229 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 
13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.623156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.687676 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xxh\" (UniqueName: \"kubernetes.io/projected/2af5836e-8c76-4432-95c0-ef34d6fc3528-kube-api-access-88xxh\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af5836e-8c76-4432-95c0-ef34d6fc3528-logs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688467 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af5836e-8c76-4432-95c0-ef34d6fc3528-logs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688710 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-scripts\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-tls-certs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688880 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-combined-ca-bundle\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.689294 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-secret-key\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.689357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-config-data\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.689881 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-scripts\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.690740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-config-data\") pod 
\"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.692789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-tls-certs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.693297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-combined-ca-bundle\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.707292 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-secret-key\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.710326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xxh\" (UniqueName: \"kubernetes.io/projected/2af5836e-8c76-4432-95c0-ef34d6fc3528-kube-api-access-88xxh\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.747897 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.868646 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:56 crc kubenswrapper[4755]: I0320 13:49:56.432210 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:56 crc kubenswrapper[4755]: I0320 13:49:56.502160 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:49:56 crc kubenswrapper[4755]: I0320 13:49:56.502411 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" containerID="cri-o://0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141" gracePeriod=10 Mar 20 13:49:57 crc kubenswrapper[4755]: I0320 13:49:57.192771 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:49:57 crc kubenswrapper[4755]: I0320 13:49:57.569251 4755 generic.go:334] "Generic (PLEG): container finished" podID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerID="0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141" exitCode=0 Mar 20 13:49:57 crc kubenswrapper[4755]: I0320 13:49:57.569332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerDied","Data":"0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141"} Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.136209 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.138442 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.147247 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.147573 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.147795 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.149632 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.312002 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"auto-csr-approver-29566910-g8cp9\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.414132 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"auto-csr-approver-29566910-g8cp9\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.434191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"auto-csr-approver-29566910-g8cp9\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " 
pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.467584 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:02 crc kubenswrapper[4755]: I0320 13:50:02.192366 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:50:07 crc kubenswrapper[4755]: I0320 13:50:07.191863 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:50:07 crc kubenswrapper[4755]: I0320 13:50:07.192523 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:50:09 crc kubenswrapper[4755]: E0320 13:50:09.837608 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 20 13:50:09 crc kubenswrapper[4755]: E0320 13:50:09.838177 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m97kr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-cxr9p_openstack(7ea35a84-68ca-4490-b1d9-fa999ef63ebe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:09 crc kubenswrapper[4755]: E0320 13:50:09.839638 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-cxr9p" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.138098 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.138421 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbbh67hfh64fhc6h694hb8h5c7h58fh6bh5f5h58bh5b8h5d9h5d7h65dh688h65bh678h598h95h668hb7hb9h6fh67dh555h69hbbh5f8h665h555q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxd7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ded8942b-87a3-49fa-80fb-dc830c09f18d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.205467 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.205637 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58h8ch55fh84h94hf8h96h5h594h564h694h594h99h85hb7h564h647h66dh576hd6h597hb9h659hf5h65ch5c9h8fh5f4h66fhf6h55ch567q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjbkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8c6c7fc7-qs6b6_openstack(7e3b9192-0e1c-4c85-82de-3a54a4272c48): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.208070 
4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8c6c7fc7-qs6b6" podUID="7e3b9192-0e1c-4c85-82de-3a54a4272c48" Mar 20 13:50:10 crc kubenswrapper[4755]: I0320 13:50:10.693212 4755 generic.go:334] "Generic (PLEG): container finished" podID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerID="46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967" exitCode=0 Mar 20 13:50:10 crc kubenswrapper[4755]: I0320 13:50:10.693308 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerDied","Data":"46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967"} Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.695869 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-cxr9p" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.192817 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:50:12 crc kubenswrapper[4755]: E0320 13:50:12.288301 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 13:50:12 crc kubenswrapper[4755]: E0320 13:50:12.288482 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h574h5ffh57bh675hd4h5f4hf4h565h67fh696h96h58fh697hd7h89h5dbh86h5f9h545h5d4h5c7h5c6h5c5hd7h5cfh94h549h676h7bh657h668q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mv45n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDev
ices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-74d5b88dcf-ftnlg_openstack(2f75cbbe-c852-4090-aca4-42cd87a3a9b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.373527 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500352 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500427 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500487 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500590 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.507911 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.509906 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts" (OuterVolumeSpecName: "scripts") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.527897 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh" (OuterVolumeSpecName: "kube-api-access-jr7jh") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "kube-api-access-jr7jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.533851 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.538194 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.561109 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data" (OuterVolumeSpecName: "config-data") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603149 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603187 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603197 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603207 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603214 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603222 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.721114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerDied","Data":"7a344e33e76b6340b3d36c5febb65925d0d2e9247a3cde79674c8ed947f90df6"} Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 
13:50:12.721179 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a344e33e76b6340b3d36c5febb65925d0d2e9247a3cde79674c8ed947f90df6" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.721297 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.469756 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.481683 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.570604 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 13:50:13.571111 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerName="keystone-bootstrap" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.571134 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerName="keystone-bootstrap" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.571303 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerName="keystone-bootstrap" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.571914 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.573666 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.573965 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.573980 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.574820 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.582785 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.592153 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625571 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625743 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625796 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727944 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.728065 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.728089 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.733383 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"keystone-bootstrap-rwsvb\" (UID: 
\"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.733491 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.733499 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.735524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.736541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.745523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 
13:50:13.880845 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 13:50:13.881057 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fl4vg,
ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jrf8c_openstack(25bd1da4-7fdb-4bd9-8405-a37fc6c18be0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 13:50:13.883937 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jrf8c" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.901521 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.977557 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.990468 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52m67" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:13.998630 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.006105 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036161 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036569 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs" (OuterVolumeSpecName: "logs") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036997 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.037031 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod 
\"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.037223 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.037855 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data" (OuterVolumeSpecName: "config-data") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.038241 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.038255 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.038831 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts" (OuterVolumeSpecName: "scripts") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.079522 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.085467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf" (OuterVolumeSpecName: "kube-api-access-tjbkf") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "kube-api-access-tjbkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139669 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139733 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139764 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc 
kubenswrapper[4755]: I0320 13:50:14.139788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139828 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139850 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139886 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l4tj\" (UniqueName: 
\"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139920 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139955 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139996 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140014 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140067 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140121 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140139 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140195 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140246 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140293 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs" (OuterVolumeSpecName: "logs") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140563 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140580 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140593 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140603 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.143341 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.146701 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts" (OuterVolumeSpecName: "scripts") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.147303 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.148116 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl" (OuterVolumeSpecName: "kube-api-access-26ffl") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "kube-api-access-26ffl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.149416 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68" (OuterVolumeSpecName: "kube-api-access-8kh68") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "kube-api-access-8kh68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.149800 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs" (OuterVolumeSpecName: "logs") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.150048 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.150184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj" (OuterVolumeSpecName: "kube-api-access-6l4tj") pod "69707be4-e338-4e13-8ecc-8cfd7cd416b2" (UID: "69707be4-e338-4e13-8ecc-8cfd7cd416b2"). InnerVolumeSpecName "kube-api-access-6l4tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.155868 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.167629 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts" (OuterVolumeSpecName: "scripts") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.175341 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69707be4-e338-4e13-8ecc-8cfd7cd416b2" (UID: "69707be4-e338-4e13-8ecc-8cfd7cd416b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.182829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.187062 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.195339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config" (OuterVolumeSpecName: "config") pod "69707be4-e338-4e13-8ecc-8cfd7cd416b2" (UID: "69707be4-e338-4e13-8ecc-8cfd7cd416b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.199884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data" (OuterVolumeSpecName: "config-data") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.200339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.205747 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data" (OuterVolumeSpecName: "config-data") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.218091 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242075 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242132 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242142 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242153 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242163 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242173 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242182 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242190 4755 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242198 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242205 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242218 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242227 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242236 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242247 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242257 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242271 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242281 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242289 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.268608 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.274625 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.344969 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.345034 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.544821 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.552701 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.552956 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8zjs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Volume
Devices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-dtggj_openstack(95c76f8c-7b76-4714-adac-6297b84d6492): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.554235 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-dtggj" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651161 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651407 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651509 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651530 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod 
\"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651578 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.659449 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4" (OuterVolumeSpecName: "kube-api-access-v57f4") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "kube-api-access-v57f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.696025 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.696697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.698081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config" (OuterVolumeSpecName: "config") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.718881 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.746860 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52m67" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.746890 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerDied","Data":"312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997"} Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.746947 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.749424 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.749797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerDied","Data":"c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c"} Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.749844 4755 scope.go:117] "RemoveContainer" containerID="54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753005 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753028 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753039 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753049 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753071 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.761000 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerDied","Data":"fa39cf0d73d745c483cbf46583864f4385b896f58bc8b0da2a658f1a87cd2c55"} Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.761112 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.775328 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerDied","Data":"267e7baeb6290269d8531900c4aac9bc633ebbbaa20000911465b29c50a00f91"} Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.775390 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.778400 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.778514 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c6c7fc7-qs6b6" event={"ID":"7e3b9192-0e1c-4c85-82de-3a54a4272c48","Type":"ContainerDied","Data":"bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5"} Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.780987 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jrf8c" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.781098 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-dtggj" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.864257 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.874638 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.880098 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.887184 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.924669 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:14 crc 
kubenswrapper[4755]: I0320 13:50:14.943237 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.943948 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerName="neutron-db-sync" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944034 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerName="neutron-db-sync" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944104 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="init" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944165 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="init" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944228 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944278 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944344 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944409 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944480 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944540 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944612 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944684 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944753 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944809 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945008 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerName="neutron-db-sync" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945073 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945134 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945206 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945262 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945320 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.946268 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.949057 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.949137 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f4frh" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.949294 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.950508 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.950778 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.971616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.996913 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.004924 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.011394 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.011983 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.037307 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.044512 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.051645 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.071636 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9f7d4c74d-t7tpq"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.085458 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.085892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.088385 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.088720 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089285 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089846 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.090158 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.090855 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091076 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091258 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091421 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091616 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.092414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.092557 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.195433 4755 scope.go:117] "RemoveContainer" containerID="6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.198886 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod 
\"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.198945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.198975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199052 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199127 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199163 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199179 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199199 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc 
kubenswrapper[4755]: I0320 13:50:15.199216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199273 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200104 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200708 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200898 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.202455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.203991 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.208427 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.208709 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.209219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.217119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.220768 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.221042 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.239095 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.256981 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.267139 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" path="/var/lib/kubelet/pods/1d02dff6-d832-40b7-8291-f7f08be96659/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.271112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.272190 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.272287 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3b9192-0e1c-4c85-82de-3a54a4272c48" path="/var/lib/kubelet/pods/7e3b9192-0e1c-4c85-82de-3a54a4272c48/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.272983 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" path="/var/lib/kubelet/pods/89ac3c10-1912-4807-a62e-d91f5e5682b4/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.282542 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" path="/var/lib/kubelet/pods/9f832429-63c8-4af9-b0ed-26e3f989125c/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.283918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.313942 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" path="/var/lib/kubelet/pods/f52d787e-af63-491a-a3f9-2a9626a9f8b8/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.328693 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: 
\"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.337779 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.380383 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.383046 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.387868 4755 scope.go:117] "RemoveContainer" containerID="d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.428058 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.443941 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.446362 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.450374 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.450692 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4kqwk" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.451021 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.460404 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.476095 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506065 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506192 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506251 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506291 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506349 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506392 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.512917 4755 scope.go:117] "RemoveContainer" containerID="64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.550498 4755 scope.go:117] "RemoveContainer" containerID="0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.572956 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610046 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610091 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610164 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " 
pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610319 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: 
I0320 13:50:15.610341 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.611920 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.611961 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.612485 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.612560 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.613264 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.614624 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.622715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.625554 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.633832 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.634365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.634836 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.634833 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: W0320 13:50:15.651637 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf74e82d0_07c7_4a72_baa4_9ec1e8427b5f.slice/crio-7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2 WatchSource:0}: Error finding container 7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2: Status 404 returned error can't find the container with id 7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2 Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.703947 4755 scope.go:117] "RemoveContainer" containerID="1b3d02ef6c9328638dff17201b5ab810e62505bd6c549c62927f2bbd73723e85" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.718637 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:50:15 crc kubenswrapper[4755]: E0320 13:50:15.745052 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-74d5b88dcf-ftnlg" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.745808 4755 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.796851 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.816394 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f7d4c74d-t7tpq" event={"ID":"2af5836e-8c76-4432-95c0-ef34d6fc3528","Type":"ContainerStarted","Data":"5bdd7e139d44ce5556e1d3d02993b6d72f9014d19e2934c54cab87ef83fcc71a"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.819455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerStarted","Data":"44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.819585 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74d5b88dcf-ftnlg" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" containerID="cri-o://44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2" gracePeriod=30 Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.831369 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerStarted","Data":"fd011c06f3fffccd2ebc454db1a10f42c4b31b9cc3cdee3a458a0730af40410b"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.856114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerStarted","Data":"be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.858565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerStarted","Data":"7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.972549 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.283319 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.456092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.509028 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.645125 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.735866 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.890849 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerStarted","Data":"df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.891275 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerStarted","Data":"a61f3e56c94a7d7b55d3ff51195c6f1eef99e6529c65900adf3d0d905f6a2838"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.909180 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerStarted","Data":"2a8c9089e1e7047efefde5df6ddda04cf4071fad9b8adaef75c6dc167f6e724d"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.918829 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rwsvb" podStartSLOduration=3.918647472 podStartE2EDuration="3.918647472s" podCreationTimestamp="2026-03-20 13:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:16.910909244 +0000 UTC m=+1196.508841773" watchObservedRunningTime="2026-03-20 13:50:16.918647472 +0000 UTC m=+1196.516580001" Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.976108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f7d4c74d-t7tpq" event={"ID":"2af5836e-8c76-4432-95c0-ef34d6fc3528","Type":"ContainerStarted","Data":"7c00d429c587fdfb6997848606a9ce03e91831e8523d4f6e66667f14705ac4e2"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.976159 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f7d4c74d-t7tpq" event={"ID":"2af5836e-8c76-4432-95c0-ef34d6fc3528","Type":"ContainerStarted","Data":"2d33ff4239924776fca486654778ae26f786d68ec2a167824204dbc679fc4090"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.982235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerStarted","Data":"c78877e8c0818f15b0f2e1b6adc8bc55f8f067e988d6b39df713c8a50ee71484"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.992579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerStarted","Data":"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8"} Mar 20 13:50:17 crc 
kubenswrapper[4755]: I0320 13:50:17.000559 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.003741 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerStarted","Data":"e2bc4a76cdde92278100366b804ccffd9de5944afb2aabb9539badf236736c8a"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.014421 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerStarted","Data":"4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.014739 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fb9d46f97-rdkvb" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" containerID="cri-o://be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed" gracePeriod=30 Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.014914 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fb9d46f97-rdkvb" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" containerID="cri-o://4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08" gracePeriod=30 Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.015984 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9f7d4c74d-t7tpq" podStartSLOduration=23.015954331 podStartE2EDuration="23.015954331s" podCreationTimestamp="2026-03-20 13:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:17.005125651 +0000 UTC m=+1196.603058190" watchObservedRunningTime="2026-03-20 13:50:17.015954331 +0000 UTC m=+1196.613886860" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.041273 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerStarted","Data":"973fbf181b3cd53cfc9d88b983db4e84c26d3e70cf15d7f503f2e3da897707a2"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.054588 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54fd48b444-c4c9l" podStartSLOduration=23.054562286 podStartE2EDuration="23.054562286s" podCreationTimestamp="2026-03-20 13:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:17.038640475 +0000 UTC m=+1196.636573004" watchObservedRunningTime="2026-03-20 13:50:17.054562286 +0000 UTC m=+1196.652494825" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.117758 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fb9d46f97-rdkvb" podStartSLOduration=4.446757397 podStartE2EDuration="28.117729604s" podCreationTimestamp="2026-03-20 13:49:49 +0000 UTC" firstStartedPulling="2026-03-20 13:49:50.170367898 +0000 UTC m=+1169.768300427" lastFinishedPulling="2026-03-20 13:50:13.841340105 +0000 UTC m=+1193.439272634" observedRunningTime="2026-03-20 13:50:17.069729166 +0000 UTC m=+1196.667661685" watchObservedRunningTime="2026-03-20 13:50:17.117729604 +0000 UTC m=+1196.715662143" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.979671 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.982092 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.986991 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.988111 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.011706 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.089803 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.089885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.089999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090022 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090118 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090193 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.103576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerStarted","Data":"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.121495 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" exitCode=0 Mar 20 13:50:18 crc kubenswrapper[4755]: 
I0320 13:50:18.122058 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerDied","Data":"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.197596 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.203385 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.208766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.208832 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209326 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.215269 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.218885 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"neutron-68899c9585-6xzdq\" (UID: 
\"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.226302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.228273 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.244382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.261065 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerStarted","Data":"5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.265310 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.293302 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerStarted","Data":"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.325075 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" podStartSLOduration=17.046940819 podStartE2EDuration="18.325049162s" podCreationTimestamp="2026-03-20 13:50:00 +0000 UTC" firstStartedPulling="2026-03-20 13:50:15.681373042 +0000 UTC m=+1195.279305571" lastFinishedPulling="2026-03-20 13:50:16.959481385 +0000 UTC m=+1196.557413914" observedRunningTime="2026-03-20 13:50:18.286824088 +0000 UTC m=+1197.884756607" watchObservedRunningTime="2026-03-20 13:50:18.325049162 +0000 UTC m=+1197.922981691" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.341986 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerStarted","Data":"cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.342055 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerStarted","Data":"7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.342387 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.360920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerStarted","Data":"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589"} Mar 20 13:50:18 crc 
kubenswrapper[4755]: I0320 13:50:18.384580 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.390112 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7cf99699dd-lg99t" podStartSLOduration=3.390074389 podStartE2EDuration="3.390074389s" podCreationTimestamp="2026-03-20 13:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:18.380625285 +0000 UTC m=+1197.978557814" watchObservedRunningTime="2026-03-20 13:50:18.390074389 +0000 UTC m=+1197.988006918" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.288832 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.390088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerStarted","Data":"d570686fe60e350337cd58181076d1e8f618d5307ff29d77301f5c839ae0e2dc"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.411406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerStarted","Data":"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.421445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerStarted","Data":"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.422385 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.424640 4755 generic.go:334] "Generic (PLEG): container finished" podID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerID="5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418" exitCode=0 Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.424700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerDied","Data":"5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.428325 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerStarted","Data":"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.450455 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" podStartSLOduration=4.450436849 podStartE2EDuration="4.450436849s" podCreationTimestamp="2026-03-20 13:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:19.4419312 +0000 UTC m=+1199.039863729" watchObservedRunningTime="2026-03-20 13:50:19.450436849 +0000 UTC m=+1199.048369378" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.464021 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.483360 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.483335147 podStartE2EDuration="5.483335147s" podCreationTimestamp="2026-03-20 13:50:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:19.466421092 +0000 UTC m=+1199.064353621" watchObservedRunningTime="2026-03-20 13:50:19.483335147 +0000 UTC m=+1199.081267676" Mar 20 13:50:20 crc kubenswrapper[4755]: I0320 13:50:20.441501 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerStarted","Data":"2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e"} Mar 20 13:50:20 crc kubenswrapper[4755]: I0320 13:50:20.496778 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.496757119 podStartE2EDuration="6.496757119s" podCreationTimestamp="2026-03-20 13:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:20.4630513 +0000 UTC m=+1200.060983849" watchObservedRunningTime="2026-03-20 13:50:20.496757119 +0000 UTC m=+1200.094689648" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.035884 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.222260 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.240887 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds" (OuterVolumeSpecName: "kube-api-access-zrdds") pod "f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" (UID: "f74e82d0-07c7-4a72-baa4-9ec1e8427b5f"). InnerVolumeSpecName "kube-api-access-zrdds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.325319 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.347209 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.357335 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.452929 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.453756 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerDied","Data":"7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2"} Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.453799 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.458850 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerStarted","Data":"b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105"} Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.459344 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.483514 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68899c9585-6xzdq" podStartSLOduration=4.483495271 podStartE2EDuration="4.483495271s" podCreationTimestamp="2026-03-20 13:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:21.479647362 +0000 UTC m=+1201.077579911" watchObservedRunningTime="2026-03-20 13:50:21.483495271 +0000 UTC m=+1201.081427820" Mar 20 13:50:23 crc kubenswrapper[4755]: I0320 13:50:23.236401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" path="/var/lib/kubelet/pods/719824b6-7bd2-41dc-a61f-039b161a94d6/volumes" Mar 20 13:50:23 crc kubenswrapper[4755]: I0320 13:50:23.497386 4755 
generic.go:334] "Generic (PLEG): container finished" podID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerID="df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9" exitCode=0 Mar 20 13:50:23 crc kubenswrapper[4755]: I0320 13:50:23.497452 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerDied","Data":"df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9"} Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.749846 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.750593 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.870823 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.870876 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.343055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.343160 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.384378 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.392803 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.489376 
4755 scope.go:117] "RemoveContainer" containerID="9412ee211cf01afed52e63d1365ec0ed2b0d225ddc278755d3632e23fa6fff43" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.518030 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.518101 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.573919 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.574006 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.620691 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.629769 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.749633 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.824077 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.824276 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" containerID="cri-o://70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f" gracePeriod=10 Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.300422 4755 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.432757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.432824 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.432999 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.433204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.439668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.439872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.443182 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8" (OuterVolumeSpecName: "kube-api-access-xssz8") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "kube-api-access-xssz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.445803 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.447055 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.464867 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.486362 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts" (OuterVolumeSpecName: "scripts") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.490139 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data" (OuterVolumeSpecName: "config-data") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.506622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547074 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547101 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547110 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547119 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547128 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.571173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerDied","Data":"a61f3e56c94a7d7b55d3ff51195c6f1eef99e6529c65900adf3d0d905f6a2838"} Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.571217 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61f3e56c94a7d7b55d3ff51195c6f1eef99e6529c65900adf3d0d905f6a2838" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.571283 4755 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.587616 4755 generic.go:334] "Generic (PLEG): container finished" podID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerID="70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f" exitCode=0 Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.588225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerDied","Data":"70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f"} Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.594201 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.594370 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.658143 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750431 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750545 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750584 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750817 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750886 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750916 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7m57\" 
(UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.761862 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57" (OuterVolumeSpecName: "kube-api-access-p7m57") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "kube-api-access-p7m57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.853118 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.995704 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.003469 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.008893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config" (OuterVolumeSpecName: "config") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.009769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.012792 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060007 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060523 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060534 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060544 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060554 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.455538 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8f554bbf4-zvxzv"] Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456059 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="init" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456080 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="init" Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456093 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerName="keystone-bootstrap" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456100 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerName="keystone-bootstrap" Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456125 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456135 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456171 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerName="oc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456178 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerName="oc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456368 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerName="keystone-bootstrap" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456382 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456395 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerName="oc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.457131 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.463501 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.463623 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.463913 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.465503 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.465516 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.465762 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467463 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-scripts\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467502 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-fernet-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467621 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-credential-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-combined-ca-bundle\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467827 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6wrx\" (UniqueName: \"kubernetes.io/projected/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-kube-api-access-h6wrx\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-internal-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.468040 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-public-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.468080 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-config-data\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.483824 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8f554bbf4-zvxzv"] Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.570944 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-credential-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-combined-ca-bundle\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6wrx\" (UniqueName: \"kubernetes.io/projected/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-kube-api-access-h6wrx\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-internal-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " 
pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571172 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-public-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-config-data\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-scripts\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571239 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-fernet-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.580494 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-fernet-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.588665 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-scripts\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.588878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-credential-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.589263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-combined-ca-bundle\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.591307 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-internal-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.595387 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-config-data\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.597529 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-public-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.599404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6wrx\" (UniqueName: \"kubernetes.io/projected/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-kube-api-access-h6wrx\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.611481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerDied","Data":"f4cea975e04082628cd5787018d084f226541b1327d170cee0f4b957229de5d6"} Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.611641 4755 scope.go:117] "RemoveContainer" containerID="70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.611861 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.618481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"} Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.637882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerStarted","Data":"88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f"} Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.687463 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cxr9p" podStartSLOduration=3.367121269 podStartE2EDuration="42.687436767s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="2026-03-20 13:49:47.169834139 +0000 UTC m=+1166.767766658" lastFinishedPulling="2026-03-20 13:50:26.490149627 +0000 UTC m=+1206.088082156" observedRunningTime="2026-03-20 13:50:27.675727445 +0000 UTC m=+1207.273659994" watchObservedRunningTime="2026-03-20 13:50:27.687436767 +0000 UTC m=+1207.285369296" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.712274 4755 scope.go:117] "RemoveContainer" containerID="e30f36458d80c9adebf91aebb787303d5a4c021136e6f5c28c1778fb3d808295" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.753381 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.782398 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.802790 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.574976 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8f554bbf4-zvxzv"] Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.657962 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8f554bbf4-zvxzv" event={"ID":"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23","Type":"ContainerStarted","Data":"585cb9060224708e15b0a8492d4c40abb252cdae21edc05d77427f0346747b94"} Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.658025 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.658051 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.211013 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.211474 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.249027 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" path="/var/lib/kubelet/pods/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e/volumes" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.696273 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8f554bbf4-zvxzv" event={"ID":"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23","Type":"ContainerStarted","Data":"3f3d1cdfea6af9d8a908a412c496899ae82961d54640b17676dd0e9e650416a4"} Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.696771 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.728735 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8f554bbf4-zvxzv" podStartSLOduration=2.728705751 podStartE2EDuration="2.728705751s" podCreationTimestamp="2026-03-20 13:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:29.717400259 +0000 UTC m=+1209.315332808" watchObservedRunningTime="2026-03-20 13:50:29.728705751 +0000 UTC m=+1209.326638280" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.108447 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.108537 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.710127 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.714794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerDied","Data":"88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f"} Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.713635 4755 generic.go:334] "Generic (PLEG): container finished" podID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerID="88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f" exitCode=0 Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.724584 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jrf8c" 
event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerStarted","Data":"d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3"} Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.824328 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jrf8c" podStartSLOduration=4.016042495 podStartE2EDuration="45.824293549s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="2026-03-20 13:49:47.355031793 +0000 UTC m=+1166.952964322" lastFinishedPulling="2026-03-20 13:50:29.163282847 +0000 UTC m=+1208.761215376" observedRunningTime="2026-03-20 13:50:30.806215164 +0000 UTC m=+1210.404147693" watchObservedRunningTime="2026-03-20 13:50:30.824293549 +0000 UTC m=+1210.422226078" Mar 20 13:50:31 crc kubenswrapper[4755]: I0320 13:50:31.432741 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout" Mar 20 13:50:31 crc kubenswrapper[4755]: I0320 13:50:31.735210 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerStarted","Data":"d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597"} Mar 20 13:50:31 crc kubenswrapper[4755]: I0320 13:50:31.758118 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dtggj" podStartSLOduration=3.490697084 podStartE2EDuration="46.758100808s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="2026-03-20 13:49:47.441602564 +0000 UTC m=+1167.039535093" lastFinishedPulling="2026-03-20 13:50:30.709006288 +0000 UTC m=+1210.306938817" observedRunningTime="2026-03-20 13:50:31.754058994 +0000 UTC m=+1211.351991523" watchObservedRunningTime="2026-03-20 13:50:31.758100808 +0000 UTC 
m=+1211.356033337" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.218813 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cxr9p" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333371 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333491 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333624 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333789 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 
13:50:32.336085 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs" (OuterVolumeSpecName: "logs") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.354954 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr" (OuterVolumeSpecName: "kube-api-access-m97kr") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "kube-api-access-m97kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.355400 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts" (OuterVolumeSpecName: "scripts") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.364151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data" (OuterVolumeSpecName: "config-data") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.366484 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438582 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438624 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438634 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438643 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438670 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.752618 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" 
event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerDied","Data":"7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662"} Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.752693 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.752746 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cxr9p" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.957186 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-65884d74bb-n9mkw"] Mar 20 13:50:32 crc kubenswrapper[4755]: E0320 13:50:32.957529 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerName="placement-db-sync" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.957544 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerName="placement-db-sync" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.962055 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerName="placement-db-sync" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.963877 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.971457 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.971715 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.971928 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.972905 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ncc5q" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.973143 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.007326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65884d74bb-n9mkw"] Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-combined-ca-bundle\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051358 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-public-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051410 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-config-data\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gjcb\" (UniqueName: \"kubernetes.io/projected/0187d784-0bbe-4f5f-9b84-ee240bb90970-kube-api-access-9gjcb\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051497 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-scripts\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051528 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0187d784-0bbe-4f5f-9b84-ee240bb90970-logs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051546 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-internal-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154216 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-public-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154260 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-config-data\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154305 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gjcb\" (UniqueName: \"kubernetes.io/projected/0187d784-0bbe-4f5f-9b84-ee240bb90970-kube-api-access-9gjcb\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154347 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-scripts\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154375 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0187d784-0bbe-4f5f-9b84-ee240bb90970-logs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154398 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-internal-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154445 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-combined-ca-bundle\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.156270 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0187d784-0bbe-4f5f-9b84-ee240bb90970-logs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.164102 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-combined-ca-bundle\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.164404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-internal-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.168357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-scripts\") pod \"placement-65884d74bb-n9mkw\" (UID: 
\"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.170347 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-config-data\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.175593 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-public-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.177676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gjcb\" (UniqueName: \"kubernetes.io/projected/0187d784-0bbe-4f5f-9b84-ee240bb90970-kube-api-access-9gjcb\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.291202 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.810878 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65884d74bb-n9mkw"] Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.751451 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.780299 4755 generic.go:334] "Generic (PLEG): container finished" podID="95c76f8c-7b76-4714-adac-6297b84d6492" containerID="d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597" exitCode=0 Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.780407 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerDied","Data":"d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597"} Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.872345 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9f7d4c74d-t7tpq" podUID="2af5836e-8c76-4432-95c0-ef34d6fc3528" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 13:50:36 crc kubenswrapper[4755]: I0320 13:50:36.802019 4755 generic.go:334] "Generic (PLEG): container finished" podID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerID="d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3" exitCode=0 Mar 20 13:50:36 crc kubenswrapper[4755]: I0320 13:50:36.802093 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jrf8c" 
event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerDied","Data":"d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3"} Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.743261 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtggj" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.855033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerDied","Data":"39b250f5e1e7bb646a21c55743a8b2114ac7daab0c0bbdbf6157f556d4805a70"} Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.855076 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39b250f5e1e7bb646a21c55743a8b2114ac7daab0c0bbdbf6157f556d4805a70" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.855136 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtggj" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.859751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65884d74bb-n9mkw" event={"ID":"0187d784-0bbe-4f5f-9b84-ee240bb90970","Type":"ContainerStarted","Data":"186b314629f910721f0b460f6b59d2c17b3ba67fcd83d6c40f5827c4304495b1"} Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.926374 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"95c76f8c-7b76-4714-adac-6297b84d6492\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.926418 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod 
\"95c76f8c-7b76-4714-adac-6297b84d6492\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.926462 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"95c76f8c-7b76-4714-adac-6297b84d6492\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.935742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7" (OuterVolumeSpecName: "kube-api-access-8zjs7") pod "95c76f8c-7b76-4714-adac-6297b84d6492" (UID: "95c76f8c-7b76-4714-adac-6297b84d6492"). InnerVolumeSpecName "kube-api-access-8zjs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.971159 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "95c76f8c-7b76-4714-adac-6297b84d6492" (UID: "95c76f8c-7b76-4714-adac-6297b84d6492"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.975488 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c76f8c-7b76-4714-adac-6297b84d6492" (UID: "95c76f8c-7b76-4714-adac-6297b84d6492"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.030246 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.030279 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.030290 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.876727 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jrf8c" event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerDied","Data":"88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9"} Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.877211 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.074954 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.132730 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56b9dc5449-j62ns"] Mar 20 13:50:39 crc kubenswrapper[4755]: E0320 13:50:39.133412 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" containerName="barbican-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.133495 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" containerName="barbican-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: E0320 13:50:39.133559 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerName="cinder-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.133634 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerName="cinder-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.133964 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerName="cinder-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.134069 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" containerName="barbican-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.135149 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.144103 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.149180 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.170556 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nvndn" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.186696 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-cbc45f8f6-z2sx8"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.188867 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189560 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189716 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189851 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" 
(UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189942 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.190060 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.193522 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.200318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.230590 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg" (OuterVolumeSpecName: "kube-api-access-fl4vg") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "kube-api-access-fl4vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.230720 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.231681 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data-custom\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.231795 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.231980 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-combined-ca-bundle\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232020 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq97w\" (UniqueName: 
\"kubernetes.io/projected/d2108220-35b4-45b7-a2bc-e93138394ff0-kube-api-access-cq97w\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232140 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2108220-35b4-45b7-a2bc-e93138394ff0-logs\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232286 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232301 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232312 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.257402 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts" (OuterVolumeSpecName: "scripts") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.282484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.388552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.436193 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data" (OuterVolumeSpecName: "config-data") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2108220-35b4-45b7-a2bc-e93138394ff0-logs\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480258 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data-custom\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-combined-ca-bundle\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480505 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq97w\" (UniqueName: \"kubernetes.io/projected/d2108220-35b4-45b7-a2bc-e93138394ff0-kube-api-access-cq97w\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " 
pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480611 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480626 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480640 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.489067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2108220-35b4-45b7-a2bc-e93138394ff0-logs\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.502968 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-combined-ca-bundle\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.510879 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc 
kubenswrapper[4755]: I0320 13:50:39.514000 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56b9dc5449-j62ns"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.514035 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-cbc45f8f6-z2sx8"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.515538 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data-custom\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.537820 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.540471 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.548752 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.571057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq97w\" (UniqueName: \"kubernetes.io/projected/d2108220-35b4-45b7-a2bc-e93138394ff0-kube-api-access-cq97w\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585550 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: 
\"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585616 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585647 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585685 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a78d73-f853-49d7-99b2-81c25ea6bb20-logs\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585707 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585730 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96rzz\" (UniqueName: \"kubernetes.io/projected/55a78d73-f853-49d7-99b2-81c25ea6bb20-kube-api-access-96rzz\") pod 
\"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585752 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585773 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-combined-ca-bundle\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585824 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data-custom\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585877 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.625433 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.627472 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.630323 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.637063 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688312 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688633 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688710 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a78d73-f853-49d7-99b2-81c25ea6bb20-logs\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96rzz\" (UniqueName: \"kubernetes.io/projected/55a78d73-f853-49d7-99b2-81c25ea6bb20-kube-api-access-96rzz\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688893 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688922 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-combined-ca-bundle\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688980 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data-custom\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.690940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a78d73-f853-49d7-99b2-81c25ea6bb20-logs\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.691962 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.692302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.692947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.693104 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data-custom\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.693327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.695741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.696024 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.698305 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-combined-ca-bundle\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.715171 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96rzz\" (UniqueName: \"kubernetes.io/projected/55a78d73-f853-49d7-99b2-81c25ea6bb20-kube-api-access-96rzz\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.717018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: E0320 13:50:39.726245 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.772420 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791519 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791569 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " 
pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.881929 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.891338 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"} Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.891627 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent" containerID="cri-o://924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" gracePeriod=30 Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.892456 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.892868 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd" containerID="cri-o://a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" gracePeriod=30 Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.892974 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core" containerID="cri-o://6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" gracePeriod=30 Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894284 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894366 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod 
\"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.896587 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.906684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.907464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.918031 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.918091 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65884d74bb-n9mkw" event={"ID":"0187d784-0bbe-4f5f-9b84-ee240bb90970","Type":"ContainerStarted","Data":"334c75e201ebd221a910b6f0d02653ea2f7672aff2f9ca41d27e206c9b0110f1"} Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.919821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.946583 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.947339 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.963605 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.390134 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.454953 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56b9dc5449-j62ns"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.489224 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.562436 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.564581 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587117 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587434 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587551 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t52g6" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587784 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.610531 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.647040 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.712865 4755 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.714942 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.736713 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738307 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 
13:50:40.738333 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.754127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-cbc45f8f6-z2sx8"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839870 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839915 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839942 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839995 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840051 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840075 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840111 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " 
pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840148 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840188 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840224 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.845280 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc 
kubenswrapper[4755]: I0320 13:50:40.849857 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.851997 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.875438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.877730 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.878360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.882541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.882949 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.885897 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.886590 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.954009 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.955727 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.957159 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.985974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b9dc5449-j62ns" event={"ID":"d2108220-35b4-45b7-a2bc-e93138394ff0","Type":"ContainerStarted","Data":"076583b3b51e5d40037512ddc7d4c0826835eb73b1677d159b7eca71658140d7"} Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.998359 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65884d74bb-n9mkw" event={"ID":"0187d784-0bbe-4f5f-9b84-ee240bb90970","Type":"ContainerStarted","Data":"2e8685ea7959701a68d3522cc3c0f0316197ee7a0d886022a43a11423a58119b"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.000542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001120 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001537 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.002028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.002232 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.002891 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.031062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.037922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.038202 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-65884d74bb-n9mkw" podStartSLOduration=9.038186211 podStartE2EDuration="9.038186211s" podCreationTimestamp="2026-03-20 13:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:41.03232347 +0000 UTC m=+1220.630255999" watchObservedRunningTime="2026-03-20 13:50:41.038186211 +0000 UTC m=+1220.636118740" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.038856 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.042461 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.042857 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.050732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" event={"ID":"f9333e7f-e263-450e-8a0e-0e788a57fd6d","Type":"ContainerStarted","Data":"c99441aa9ab711c343a4fada1ebb8fbe5bc8444dea8e49f8792ccb4797464786"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.066904 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" event={"ID":"55a78d73-f853-49d7-99b2-81c25ea6bb20","Type":"ContainerStarted","Data":"3eb9eb785e7b462445fda35b80ffeb9c678e258078c5e90a1b14a0e2b7293256"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.078396 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerStarted","Data":"95166cf5a1430ac49b682e214261fcd32c3c05c72a9b931d676bf64e207ffdda"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100242 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" exitCode=0 Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100299 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" exitCode=2 Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100335 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100373 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 
13:50:41.109160 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109567 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109647 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109683 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109758 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc 
kubenswrapper[4755]: I0320 13:50:41.109905 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.212681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.212784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.212836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213161 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.214492 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.215329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.220088 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.225507 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.226215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.245482 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.266877 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.422292 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.552825 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.637961 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.640767 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.640872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.640964 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.641000 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.641028 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxd7x\" (UniqueName: 
\"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.641070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.649067 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.649158 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.664130 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x" (OuterVolumeSpecName: "kube-api-access-zxd7x") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "kube-api-access-zxd7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.691191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts" (OuterVolumeSpecName: "scripts") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755628 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755687 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755705 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755717 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.791616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.899812 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.970270 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.989774 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.997184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.023812 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data" (OuterVolumeSpecName: "config-data") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.094661 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.094706 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.112042 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.117866 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerStarted","Data":"0ac451aa4ed8d677d77946a6a4c4490aa16c5aad1720a8d22a9ecfc0acddbe6e"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.120879 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerStarted","Data":"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.121023 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerStarted","Data":"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127083 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" exitCode=0 Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127179 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127242 4755 scope.go:117] "RemoveContainer" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127434 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.134103 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerStarted","Data":"9bab4e710eb88089079a33e7c9e02e8a26667fa5d5cf668bd290bf2ec7796c32"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.144968 4755 generic.go:334] "Generic (PLEG): container finished" podID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerID="26b3d21f098f4badc81156151801e1a17d3f572d355233ff058eec05b7d0ce8b" exitCode=0 Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.145974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" event={"ID":"f9333e7f-e263-450e-8a0e-0e788a57fd6d","Type":"ContainerDied","Data":"26b3d21f098f4badc81156151801e1a17d3f572d355233ff058eec05b7d0ce8b"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.157867 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podStartSLOduration=3.157839921 
podStartE2EDuration="3.157839921s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:42.146199821 +0000 UTC m=+1221.744132350" watchObservedRunningTime="2026-03-20 13:50:42.157839921 +0000 UTC m=+1221.755772450" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.180255 4755 scope.go:117] "RemoveContainer" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.294138 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.308927 4755 scope.go:117] "RemoveContainer" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.315837 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.326725 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.327466 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327494 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd" Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.327534 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327543 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent" Mar 20 13:50:42 crc 
kubenswrapper[4755]: E0320 13:50:42.327576 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327585 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327878 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327896 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327920 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.330356 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.332592 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.333470 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.347926 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.398882 4755 scope.go:117] "RemoveContainer" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.399439 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9\": container with ID starting with a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9 not found: ID does not exist" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.399477 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"} err="failed to get container status \"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9\": rpc error: code = NotFound desc = could not find container \"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9\": container with ID starting with a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9 not found: ID does not exist" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.399507 4755 scope.go:117] "RemoveContainer" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 
13:50:42.399983 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3\": container with ID starting with 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3 not found: ID does not exist" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.400068 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"} err="failed to get container status \"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3\": rpc error: code = NotFound desc = could not find container \"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3\": container with ID starting with 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3 not found: ID does not exist" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.400112 4755 scope.go:117] "RemoveContainer" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.400696 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f\": container with ID starting with 924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f not found: ID does not exist" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.400729 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"} err="failed to get container status \"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f\": rpc 
error: code = NotFound desc = could not find container \"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f\": container with ID starting with 924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f not found: ID does not exist" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410416 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410469 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410512 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.512620 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc 
kubenswrapper[4755]: I0320 13:50:42.513598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.514518 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.515299 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod 
\"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.519622 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.521569 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.533348 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.533821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.543432 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.643140 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.660059 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.728605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729296 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729552 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") 
" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729880 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.734221 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg" (OuterVolumeSpecName: "kube-api-access-s2mdg") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "kube-api-access-s2mdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.759678 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.764374 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.766607 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.792801 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config" (OuterVolumeSpecName: "config") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833595 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833638 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833673 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833684 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 
20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833693 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.867734 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.935711 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.165694 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" event={"ID":"f9333e7f-e263-450e-8a0e-0e788a57fd6d","Type":"ContainerDied","Data":"c99441aa9ab711c343a4fada1ebb8fbe5bc8444dea8e49f8792ccb4797464786"} Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.165752 4755 scope.go:117] "RemoveContainer" containerID="26b3d21f098f4badc81156151801e1a17d3f572d355233ff058eec05b7d0ce8b" Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.165952 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.177027 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerID="74077c1fbfd38abf1631b5d13dad67c803ceb01aef73d54463a17c47b433408b" exitCode=0 Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.177201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerDied","Data":"74077c1fbfd38abf1631b5d13dad67c803ceb01aef73d54463a17c47b433408b"} Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.186556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerStarted","Data":"a07bbef33f9bf32ac084e64d0d3ad49faa7ce5c4a2cf17fd44091e1ae8ffdb19"} Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.192842 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.192883 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.278999 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" path="/var/lib/kubelet/pods/ded8942b-87a3-49fa-80fb-dc830c09f18d/volumes" Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.281103 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.289212 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.364896 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.595672 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:44 crc kubenswrapper[4755]: I0320 13:50:44.214705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerStarted","Data":"57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948"} Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.279068 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" path="/var/lib/kubelet/pods/f9333e7f-e263-450e-8a0e-0e788a57fd6d/volumes" Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350673 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"63a07ea9507732987fa76339a9da53fdf9074739ca064b32427bb55875628827"} Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350757 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerStarted","Data":"fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437"} Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" event={"ID":"55a78d73-f853-49d7-99b2-81c25ea6bb20","Type":"ContainerStarted","Data":"5f182f3057b3a41f85d8264583c773d7a77cb3fa05eac8eaf7a619fde59214e4"} Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.352025 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-56b9dc5449-j62ns" event={"ID":"d2108220-35b4-45b7-a2bc-e93138394ff0","Type":"ContainerStarted","Data":"fe0020f726940f7dec3507ad4c9a928afdb07065e96f86a88720feb42e86a8e0"} Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.447699 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-656mk" podStartSLOduration=5.447683306 podStartE2EDuration="5.447683306s" podCreationTimestamp="2026-03-20 13:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:45.417961 +0000 UTC m=+1225.015893539" watchObservedRunningTime="2026-03-20 13:50:45.447683306 +0000 UTC m=+1225.045615835" Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.828766 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.201451 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.202173 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" containerID="cri-o://2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e" gracePeriod=30 Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.202754 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" containerID="cri-o://b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105" gracePeriod=30 Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.243200 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7769db74db-f4kfh"] Mar 20 13:50:46 crc 
kubenswrapper[4755]: E0320 13:50:46.243682 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerName="init" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.243696 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerName="init" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.243909 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerName="init" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.253680 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.264853 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.265129 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.279075 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7769db74db-f4kfh"] Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.289368 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:40692->10.217.0.159:9696: read: connection reset by peer" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.294985 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-754b98cbff-jgntp"] Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.297392 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.321981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754b98cbff-jgntp"] Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401360 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45fa2a85-b7d9-413c-827c-fdcbcec05faf-logs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-ovndb-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401561 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data-custom\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401582 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-httpd-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401667 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qbvbk\" (UniqueName: \"kubernetes.io/projected/45fa2a85-b7d9-413c-827c-fdcbcec05faf-kube-api-access-qbvbk\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401713 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-internal-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-public-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401970 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-combined-ca-bundle\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402100 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-public-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402136 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-internal-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402160 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402257 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402315 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-combined-ca-bundle\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402359 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8n4\" (UniqueName: \"kubernetes.io/projected/0263cee7-e9d5-48ff-8326-7455a95311a6-kube-api-access-dq8n4\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424119 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerStarted","Data":"7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424312 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" containerID="cri-o://57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948" gracePeriod=30 Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424925 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" containerID="cri-o://7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3" gracePeriod=30 Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.462244 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.462206085 podStartE2EDuration="6.462206085s" podCreationTimestamp="2026-03-20 13:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:46.449241091 +0000 UTC m=+1226.047173610" watchObservedRunningTime="2026-03-20 13:50:46.462206085 +0000 UTC m=+1226.060138614" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.472934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" event={"ID":"55a78d73-f853-49d7-99b2-81c25ea6bb20","Type":"ContainerStarted","Data":"c18f9b9b717a7727cb85e80d5965524cb055402bc4a7cd8d667cf040706879ee"} Mar 20 13:50:46 crc 
kubenswrapper[4755]: I0320 13:50:46.502253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b9dc5449-j62ns" event={"ID":"d2108220-35b4-45b7-a2bc-e93138394ff0","Type":"ContainerStarted","Data":"6f77d1934c6dad8258402daa2e37c2a1f5cc3ef9b1ec6c01ec3c5558baf3c451"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504198 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-combined-ca-bundle\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504293 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-public-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504324 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-internal-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504341 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504417 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-combined-ca-bundle\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504439 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8n4\" (UniqueName: \"kubernetes.io/projected/0263cee7-e9d5-48ff-8326-7455a95311a6-kube-api-access-dq8n4\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45fa2a85-b7d9-413c-827c-fdcbcec05faf-logs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-ovndb-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data-custom\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504544 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-httpd-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504570 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbvbk\" (UniqueName: \"kubernetes.io/projected/45fa2a85-b7d9-413c-827c-fdcbcec05faf-kube-api-access-qbvbk\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-internal-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504620 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-public-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.519767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.534341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.553315 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-public-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.553950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-public-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.554211 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45fa2a85-b7d9-413c-827c-fdcbcec05faf-logs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.555845 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" podStartSLOduration=3.7609589359999998 podStartE2EDuration="7.555818668s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="2026-03-20 
13:50:40.784817431 +0000 UTC m=+1220.382749950" lastFinishedPulling="2026-03-20 13:50:44.579677163 +0000 UTC m=+1224.177609682" observedRunningTime="2026-03-20 13:50:46.55436726 +0000 UTC m=+1226.152299789" watchObservedRunningTime="2026-03-20 13:50:46.555818668 +0000 UTC m=+1226.153751197" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.557086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-internal-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.562767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8n4\" (UniqueName: \"kubernetes.io/projected/0263cee7-e9d5-48ff-8326-7455a95311a6-kube-api-access-dq8n4\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.564124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-internal-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.564323 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.578643 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-ovndb-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.579565 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-combined-ca-bundle\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.580151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-httpd-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.586144 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data-custom\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.603260 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-combined-ca-bundle\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.641855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbvbk\" (UniqueName: \"kubernetes.io/projected/45fa2a85-b7d9-413c-827c-fdcbcec05faf-kube-api-access-qbvbk\") pod 
\"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.658895 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerStarted","Data":"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.679542 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.690728 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerID="44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2" exitCode=137 Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.691398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerDied","Data":"44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.884214 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.897098 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929291 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929443 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929560 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929690 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.930469 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs" (OuterVolumeSpecName: "logs") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.938388 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n" (OuterVolumeSpecName: "kube-api-access-mv45n") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "kube-api-access-mv45n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.961906 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.963953 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56b9dc5449-j62ns" podStartSLOduration=3.871998037 podStartE2EDuration="7.963927216s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="2026-03-20 13:50:40.488938185 +0000 UTC m=+1220.086870714" lastFinishedPulling="2026-03-20 13:50:44.580867364 +0000 UTC m=+1224.178799893" observedRunningTime="2026-03-20 13:50:46.604231256 +0000 UTC m=+1226.202163775" watchObservedRunningTime="2026-03-20 13:50:46.963927216 +0000 UTC m=+1226.561859745" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.034302 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.034733 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.034743 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.068444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts" (OuterVolumeSpecName: "scripts") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.110406 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data" (OuterVolumeSpecName: "config-data") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.137869 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.138262 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154086 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-conmon-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-conmon-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154136 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154158 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-conmon-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-conmon-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154175 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.155148 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.160027 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.167827 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3.scope WatchSource:0}: Error finding container 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3: Status 404 returned error can't find the container with id 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3 Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.201229 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9333e7f_e263_450e_8a0e_0e788a57fd6d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9333e7f_e263_450e_8a0e_0e788a57fd6d.slice: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718428 4755 generic.go:334] "Generic (PLEG): container finished" podID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" 
containerID="4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08" exitCode=137 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718898 4755 generic.go:334] "Generic (PLEG): container finished" podID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerID="be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed" exitCode=137 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerDied","Data":"4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerDied","Data":"be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.720924 4755 generic.go:334] "Generic (PLEG): container finished" podID="58518373-7b53-4ecc-bc83-3982b7688219" containerID="7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3" exitCode=0 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.720967 4755 generic.go:334] "Generic (PLEG): container finished" podID="58518373-7b53-4ecc-bc83-3982b7688219" containerID="57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948" exitCode=143 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.721021 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerDied","Data":"7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.721056 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerDied","Data":"57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.722603 4755 generic.go:334] "Generic (PLEG): container finished" podID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerID="b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105" exitCode=0 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.722675 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerDied","Data":"b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.727827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.748028 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerStarted","Data":"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.764005 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.789722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerDied","Data":"935676461a0f68fb05e2cbff2d17aad3ec596d2702ca73c0eb704a8f7b9a97bb"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.789791 4755 scope.go:117] "RemoveContainer" containerID="44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2" Mar 20 13:50:47 
crc kubenswrapper[4755]: I0320 13:50:47.789983 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.816393 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.199341486 podStartE2EDuration="7.816353268s" podCreationTimestamp="2026-03-20 13:50:40 +0000 UTC" firstStartedPulling="2026-03-20 13:50:41.965567815 +0000 UTC m=+1221.563500344" lastFinishedPulling="2026-03-20 13:50:44.582579597 +0000 UTC m=+1224.180512126" observedRunningTime="2026-03-20 13:50:47.76909018 +0000 UTC m=+1227.367022709" watchObservedRunningTime="2026-03-20 13:50:47.816353268 +0000 UTC m=+1227.414285787" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.904718 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.912408 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:50:48 crc kubenswrapper[4755]: E0320 13:50:48.138190 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f.scope\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice/crio-935676461a0f68fb05e2cbff2d17aad3ec596d2702ca73c0eb704a8f7b9a97bb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba56df0_ceeb_40c0_b1b0_15bb4d548b80.slice/crio-conmon-4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice/crio-conmon-44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba56df0_ceeb_40c0_b1b0_15bb4d548b80.slice/crio-4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice/crio-44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.144938 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7769db74db-f4kfh"] Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.163606 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.269430 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754b98cbff-jgntp"] Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.364894 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.389929 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.489889 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.489966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc 
kubenswrapper[4755]: I0320 13:50:48.490085 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.490181 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.490262 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.497218 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs" (OuterVolumeSpecName: "logs") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.497710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.503134 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh" (OuterVolumeSpecName: "kube-api-access-rn2fh") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "kube-api-access-rn2fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.517314 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts" (OuterVolumeSpecName: "scripts") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.518452 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data" (OuterVolumeSpecName: "config-data") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.534530 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.594145 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.594861 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.594980 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.595047 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.595100 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707381 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.711455 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.711618 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.711801 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.712232 4755 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs" (OuterVolumeSpecName: "logs") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.712802 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.713012 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.718880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.723675 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts" (OuterVolumeSpecName: "scripts") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.730945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd" (OuterVolumeSpecName: "kube-api-access-rdsdd") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "kube-api-access-rdsdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.753525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815667 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815701 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815713 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815722 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815732 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.824710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data" (OuterVolumeSpecName: "config-data") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.828710 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerDied","Data":"c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.828781 4755 scope.go:117] "RemoveContainer" containerID="4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.828903 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.836273 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.837105 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerDied","Data":"a07bbef33f9bf32ac084e64d0d3ad49faa7ce5c4a2cf17fd44091e1ae8ffdb19"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.848453 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7769db74db-f4kfh" event={"ID":"45fa2a85-b7d9-413c-827c-fdcbcec05faf","Type":"ContainerStarted","Data":"510c291a7c70e7aad56901b2d2d28fa193f11c28bd50136a3b75868deba06c96"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.848504 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7769db74db-f4kfh" event={"ID":"45fa2a85-b7d9-413c-827c-fdcbcec05faf","Type":"ContainerStarted","Data":"f50d00ada2da3bbe1664473ead894d1b915667651d38013543c1f0dedb6ccc75"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.862832 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754b98cbff-jgntp" event={"ID":"0263cee7-e9d5-48ff-8326-7455a95311a6","Type":"ContainerStarted","Data":"bf095427e3eb80033429a5e7a833eaecddcfcfa1eb35f4693e34708cbf4e8b51"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.862896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754b98cbff-jgntp" event={"ID":"0263cee7-e9d5-48ff-8326-7455a95311a6","Type":"ContainerStarted","Data":"ebfb7d279d9d32922e2526035dc4a22dadb837238236158cb6c10a364f8c1547"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.867049 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.930919 4755 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.984029 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.016961 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.052962 4755 scope.go:117] "RemoveContainer" containerID="be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.072361 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074001 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074020 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074030 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074036 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074084 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074094 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 
13:50:49.074113 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074474 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074511 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074520 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074788 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074804 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074843 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074853 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074865 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.076378 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.082006 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.082356 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.082562 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.128785 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.166734 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.175779 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.208560 4755 scope.go:117] "RemoveContainer" containerID="7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240035 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-scripts\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 
13:50:49.240168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9894c7cb-7899-4354-a6c2-e7339eb1f765-logs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp97l\" (UniqueName: \"kubernetes.io/projected/9894c7cb-7899-4354-a6c2-e7339eb1f765-kube-api-access-kp97l\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240268 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data-custom\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240306 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240359 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240454 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9894c7cb-7899-4354-a6c2-e7339eb1f765-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.264905 4755 scope.go:117] "RemoveContainer" containerID="57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.319978 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" path="/var/lib/kubelet/pods/2f75cbbe-c852-4090-aca4-42cd87a3a9b3/volumes" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.320723 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58518373-7b53-4ecc-bc83-3982b7688219" path="/var/lib/kubelet/pods/58518373-7b53-4ecc-bc83-3982b7688219/volumes" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.321850 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" path="/var/lib/kubelet/pods/cba56df0-ceeb-40c0-b1b0-15bb4d548b80/volumes" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345230 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9894c7cb-7899-4354-a6c2-e7339eb1f765-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345367 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-scripts\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345516 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9894c7cb-7899-4354-a6c2-e7339eb1f765-logs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345552 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345578 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp97l\" (UniqueName: \"kubernetes.io/projected/9894c7cb-7899-4354-a6c2-e7339eb1f765-kube-api-access-kp97l\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data-custom\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " 
pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345679 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.348024 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9894c7cb-7899-4354-a6c2-e7339eb1f765-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.351421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9894c7cb-7899-4354-a6c2-e7339eb1f765-logs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.353541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data-custom\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.357054 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-scripts\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.360738 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.367205 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.373188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp97l\" (UniqueName: \"kubernetes.io/projected/9894c7cb-7899-4354-a6c2-e7339eb1f765-kube-api-access-kp97l\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.378199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.385674 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.477225 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.897271 4755 generic.go:334] "Generic (PLEG): container finished" podID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerID="2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e" exitCode=0 Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.897757 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerDied","Data":"2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e"} Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.907786 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754b98cbff-jgntp" event={"ID":"0263cee7-e9d5-48ff-8326-7455a95311a6","Type":"ContainerStarted","Data":"3c5180748278247e8682e0b8caf44998c1cfcc0ac7e8999b2bea12bcc98ec95c"} Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.909602 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.944866 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7769db74db-f4kfh" event={"ID":"45fa2a85-b7d9-413c-827c-fdcbcec05faf","Type":"ContainerStarted","Data":"57a1b5bf9d00fce8765d2aec48b5c36112e1e206c4ea6824c6387bd16ca8cafd"} Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.946540 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.946583 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.949200 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-754b98cbff-jgntp" 
podStartSLOduration=3.949183832 podStartE2EDuration="3.949183832s" podCreationTimestamp="2026-03-20 13:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:49.937251474 +0000 UTC m=+1229.535184013" watchObservedRunningTime="2026-03-20 13:50:49.949183832 +0000 UTC m=+1229.547116361" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.056527 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7769db74db-f4kfh" podStartSLOduration=4.056509187 podStartE2EDuration="4.056509187s" podCreationTimestamp="2026-03-20 13:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:49.969031093 +0000 UTC m=+1229.566963622" watchObservedRunningTime="2026-03-20 13:50:50.056509187 +0000 UTC m=+1229.654441716" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.059153 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.221885 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.378142 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.472585 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.472867 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.472982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473017 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473040 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473128 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473187 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.490356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67" (OuterVolumeSpecName: "kube-api-access-zwv67") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "kube-api-access-zwv67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.505874 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.539959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.576639 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.576694 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.576704 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.616444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.619772 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.643764 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config" (OuterVolumeSpecName: "config") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.664758 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.672480 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680098 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680156 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680172 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680187 4755 reconciler_common.go:293] "Volume detached for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.768554 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.957079 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerDied","Data":"d570686fe60e350337cd58181076d1e8f618d5307ff29d77301f5c839ae0e2dc"} Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.957140 4755 scope.go:117] "RemoveContainer" containerID="b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.957149 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.959375 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.961406 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" containerID="cri-o://50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" gracePeriod=30 Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.961626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9894c7cb-7899-4354-a6c2-e7339eb1f765","Type":"ContainerStarted","Data":"39efe372539eba9227bd4e28c3450c18b0f33b1a15d13f52fba8d4ef0c2c0e9c"} Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.962634 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54fd48b444-c4c9l" 
podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" containerID="cri-o://defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" gracePeriod=30 Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.029745 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.038104 4755 scope.go:117] "RemoveContainer" containerID="2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.050689 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.111517 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.198464 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.198735 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" containerID="cri-o://54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" gracePeriod=10 Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.289914 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" path="/var/lib/kubelet/pods/a4c0d88b-a127-41a4-824c-e09a285a5a62/volumes" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.290634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.486456 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:51 
crc kubenswrapper[4755]: I0320 13:50:51.671391 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709679 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709745 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709770 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709842 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709942 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709998 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.717431 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr" (OuterVolumeSpecName: "kube-api-access-kq9xr") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "kube-api-access-kq9xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.761489 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.764955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.770129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.796315 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config" (OuterVolumeSpecName: "config") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812624 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812689 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812703 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812715 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.816444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.823061 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.913889 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.913916 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.979565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63"} Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.980699 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981844 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" exitCode=0 Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981902 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" 
event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerDied","Data":"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c"} Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981924 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerDied","Data":"973fbf181b3cd53cfc9d88b983db4e84c26d3e70cf15d7f503f2e3da897707a2"} Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981941 4755 scope.go:117] "RemoveContainer" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.982082 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.985974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9894c7cb-7899-4354-a6c2-e7339eb1f765","Type":"ContainerStarted","Data":"fc400527a488146e60c86d906397fc6d294b199a76c49bdc269ea31150823810"} Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.003814 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.244181571 podStartE2EDuration="10.003787589s" podCreationTimestamp="2026-03-20 13:50:42 +0000 UTC" firstStartedPulling="2026-03-20 13:50:44.469874013 +0000 UTC m=+1224.067806542" lastFinishedPulling="2026-03-20 13:50:51.229480031 +0000 UTC m=+1230.827412560" observedRunningTime="2026-03-20 13:50:52.00345698 +0000 UTC m=+1231.601389509" watchObservedRunningTime="2026-03-20 13:50:52.003787589 +0000 UTC m=+1231.601720118" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.062111 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.068817 4755 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.075225 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.085639 4755 scope.go:117] "RemoveContainer" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.129874 4755 scope.go:117] "RemoveContainer" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" Mar 20 13:50:52 crc kubenswrapper[4755]: E0320 13:50:52.131966 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c\": container with ID starting with 54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c not found: ID does not exist" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.132002 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c"} err="failed to get container status \"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c\": rpc error: code = NotFound desc = could not find container \"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c\": container with ID starting with 54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c not found: ID does not exist" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.132029 4755 scope.go:117] "RemoveContainer" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" Mar 20 13:50:52 crc kubenswrapper[4755]: E0320 13:50:52.132233 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89\": container with ID starting with 418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89 not found: ID does not exist" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.132255 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89"} err="failed to get container status \"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89\": rpc error: code = NotFound desc = could not find container \"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89\": container with ID starting with 418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89 not found: ID does not exist" Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.024809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9894c7cb-7899-4354-a6c2-e7339eb1f765","Type":"ContainerStarted","Data":"de5eff2016acb7db7708d693cc4fee0449e99b199af2bf67fb6642781cb1cebd"} Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.025012 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" containerID="cri-o://4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" gracePeriod=30 Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.025721 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.025779 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" 
containerID="cri-o://7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" gracePeriod=30 Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.070813 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.0707903 podStartE2EDuration="5.0707903s" podCreationTimestamp="2026-03-20 13:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:53.05409106 +0000 UTC m=+1232.652023599" watchObservedRunningTime="2026-03-20 13:50:53.0707903 +0000 UTC m=+1232.668722829" Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.263686 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" path="/var/lib/kubelet/pods/ee532ae9-f63a-4f8c-82db-3d81014a6e05/volumes" Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.030676 4755 generic.go:334] "Generic (PLEG): container finished" podID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" exitCode=0 Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.030691 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerDied","Data":"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860"} Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.749350 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.878292 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992766 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992828 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992910 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992931 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.993055 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xpt9\" (UniqueName: 
\"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.993892 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.000545 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9" (OuterVolumeSpecName: "kube-api-access-9xpt9") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "kube-api-access-9xpt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.002993 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts" (OuterVolumeSpecName: "scripts") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.028825 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.087912 4755 generic.go:334] "Generic (PLEG): container finished" podID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" exitCode=0 Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088006 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerDied","Data":"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b"} Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerDied","Data":"9bab4e710eb88089079a33e7c9e02e8a26667fa5d5cf668bd290bf2ec7796c32"} Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088056 4755 scope.go:117] "RemoveContainer" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088236 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098096 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098126 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098137 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098149 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.140889 4755 generic.go:334] "Generic (PLEG): container finished" podID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" exitCode=0 Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.143099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerDied","Data":"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c"} Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.166995 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.201423 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.253642 4755 scope.go:117] "RemoveContainer" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.286097 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data" (OuterVolumeSpecName: "config-data") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.296928 4755 scope.go:117] "RemoveContainer" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.300768 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860\": container with ID starting with 7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860 not found: ID does not exist" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.300931 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860"} err="failed to get container status \"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860\": rpc error: code = NotFound desc = could not find container \"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860\": container with ID starting with 7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860 not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.301011 4755 scope.go:117] "RemoveContainer" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.302946 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.303142 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b\": container 
with ID starting with 4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b not found: ID does not exist" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.303232 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b"} err="failed to get container status \"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b\": rpc error: code = NotFound desc = could not find container \"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b\": container with ID starting with 4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.418806 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.432441 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.447907 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448253 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="init" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448269 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="init" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448282 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448288 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" 
Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448300 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448306 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448326 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448332 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448344 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448351 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448362 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448368 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448524 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448537 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" Mar 20 13:50:55 crc 
kubenswrapper[4755]: I0320 13:50:55.448549 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448564 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448578 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.449438 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.453960 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.462460 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.506787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507043 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-27t6t\" (UniqueName: \"kubernetes.io/projected/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-kube-api-access-27t6t\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507306 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507408 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.609084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.609170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.609960 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.610645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.611272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27t6t\" (UniqueName: \"kubernetes.io/projected/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-kube-api-access-27t6t\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.612144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.612335 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") 
" pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.615455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.616011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.616899 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.615517 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.630233 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27t6t\" (UniqueName: \"kubernetes.io/projected/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-kube-api-access-27t6t\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.779509 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:56 crc kubenswrapper[4755]: I0320 13:50:56.406767 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:57 crc kubenswrapper[4755]: I0320 13:50:57.192767 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d","Type":"ContainerStarted","Data":"29dab42a4ce34544dcbff3f9d7bf78f6406ef710c481cc96b3250981650aebc9"} Mar 20 13:50:57 crc kubenswrapper[4755]: I0320 13:50:57.193538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d","Type":"ContainerStarted","Data":"8ffc3caa8314737810ddb54f5d0eeeaaceaecf61b2e3c37e7d1af0e39ccd4b1d"} Mar 20 13:50:57 crc kubenswrapper[4755]: I0320 13:50:57.253356 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" path="/var/lib/kubelet/pods/b9649141-1c8e-4387-8cfc-81d60abf76f3/volumes" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.204717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d","Type":"ContainerStarted","Data":"5dee25b7c414f01aba2ffc1c56e8fcb3fd3f862b3df29766bfd1850c0e50dc27"} Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.237137 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.237117713 podStartE2EDuration="3.237117713s" podCreationTimestamp="2026-03-20 13:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:58.23429258 +0000 UTC m=+1237.832225109" watchObservedRunningTime="2026-03-20 13:50:58.237117713 +0000 UTC m=+1237.835050242" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 
13:50:58.475942 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.850335 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.960222 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.960780 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log" containerID="cri-o://901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4" gracePeriod=30 Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.960973 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api" containerID="cri-o://58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f" gracePeriod=30 Mar 20 13:50:59 crc kubenswrapper[4755]: I0320 13:50:59.250998 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4" exitCode=143 Mar 20 13:50:59 crc kubenswrapper[4755]: I0320 13:50:59.252131 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerDied","Data":"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"} Mar 20 13:50:59 crc kubenswrapper[4755]: I0320 13:50:59.731322 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:51:00 crc kubenswrapper[4755]: 
I0320 13:51:00.780590 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.014851 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.016702 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.020444 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.020955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nvpnw" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.020997 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.037537 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040164 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040224 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config-secret\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040266 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040298 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f69f9\" (UniqueName: \"kubernetes.io/projected/96136572-ead6-4771-bd36-eec29b5fb137-kube-api-access-f69f9\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142264 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config-secret\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142367 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142401 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f69f9\" 
(UniqueName: \"kubernetes.io/projected/96136572-ead6-4771-bd36-eec29b5fb137-kube-api-access-f69f9\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.143496 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.164522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config-secret\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.167368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.167813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f69f9\" (UniqueName: \"kubernetes.io/projected/96136572-ead6-4771-bd36-eec29b5fb137-kube-api-access-f69f9\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.340904 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.435824 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.896837 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:51:02 crc kubenswrapper[4755]: I0320 13:51:02.280920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"96136572-ead6-4771-bd36-eec29b5fb137","Type":"ContainerStarted","Data":"ffcf9257ec195bcf122894453f661953d7455e45a95cd3f0c29abf85530ccf53"} Mar 20 13:51:03 crc kubenswrapper[4755]: I0320 13:51:03.767054 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:35668->10.217.0.165:9311: read: connection reset by peer" Mar 20 13:51:03 crc kubenswrapper[4755]: I0320 13:51:03.768044 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:35658->10.217.0.165:9311: read: connection reset by peer" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.285882 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.317814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.317911 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.317971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.318074 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.318105 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.319586 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs" (OuterVolumeSpecName: "logs") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.326630 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9" (OuterVolumeSpecName: "kube-api-access-52gh9") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "kube-api-access-52gh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336134 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f" exitCode=0 Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerDied","Data":"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"} Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerDied","Data":"95166cf5a1430ac49b682e214261fcd32c3c05c72a9b931d676bf64e207ffdda"} Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336231 4755 scope.go:117] "RemoveContainer" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336434 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.348779 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.373841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.414748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data" (OuterVolumeSpecName: "config-data") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.419996 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420032 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420042 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420052 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420062 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.440986 4755 scope.go:117] "RemoveContainer" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.472225 4755 scope.go:117] "RemoveContainer" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f" Mar 20 13:51:04 crc kubenswrapper[4755]: E0320 13:51:04.473233 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f\": container with ID starting with 58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f not found: ID does not exist" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.473270 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"} err="failed to get container status \"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f\": rpc error: code = NotFound desc = could not find container \"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f\": container with ID starting with 58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f not found: ID does not exist" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.473299 4755 scope.go:117] "RemoveContainer" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4" Mar 20 13:51:04 crc kubenswrapper[4755]: E0320 13:51:04.473823 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4\": container with ID starting with 901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4 not found: ID does not exist" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.473873 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"} err="failed to get container status \"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4\": rpc error: code = NotFound desc = could not find container \"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4\": container with ID 
starting with 901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4 not found: ID does not exist" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.681050 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.691149 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.749173 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.944261 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.996267 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:51:05 crc kubenswrapper[4755]: I0320 13:51:05.237985 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" path="/var/lib/kubelet/pods/d4952a5b-9ca7-4ae1-bcd6-0598511fb809/volumes" Mar 20 13:51:06 crc kubenswrapper[4755]: I0320 13:51:06.415366 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.636188 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-847679bbfc-l8kwj"] Mar 20 13:51:07 crc kubenswrapper[4755]: E0320 13:51:07.636959 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log" Mar 20 13:51:07 crc 
kubenswrapper[4755]: I0320 13:51:07.636973 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log" Mar 20 13:51:07 crc kubenswrapper[4755]: E0320 13:51:07.636994 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.637001 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.637197 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.637211 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.638139 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.641083 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.641358 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.647367 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.673806 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-847679bbfc-l8kwj"] Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.697870 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4dt7\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-kube-api-access-b4dt7\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.697978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-internal-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-run-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc 
kubenswrapper[4755]: I0320 13:51:07.698047 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-public-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698770 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-log-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698889 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-combined-ca-bundle\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698966 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-config-data\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-etc-swift\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" 
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801049 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4dt7\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-kube-api-access-b4dt7\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801109 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-internal-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-run-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-public-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801215 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-log-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 
13:51:07.801232 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-combined-ca-bundle\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-etc-swift\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-config-data\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801875 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-log-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.802175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-run-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.808825 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-internal-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.809565 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-public-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.809762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-config-data\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.812333 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-etc-swift\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.819332 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-combined-ca-bundle\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.828139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4dt7\" (UniqueName: 
\"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-kube-api-access-b4dt7\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.966303 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258105 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258540 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" containerID="cri-o://c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f" gracePeriod=30 Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258700 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" containerID="cri-o://6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006" gracePeriod=30 Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258860 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" containerID="cri-o://1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6" gracePeriod=30 Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258953 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" containerID="cri-o://14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63" gracePeriod=30 Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 
13:51:08.292567 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF" Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.400820 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63" exitCode=0 Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401190 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006" exitCode=2 Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401205 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6" exitCode=0 Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401215 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f" exitCode=0 Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.400978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63"} Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401260 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006"} Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6"} Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401296 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f"} Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.387098 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.387626 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" containerID="cri-o://13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" gracePeriod=30 Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.388118 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" containerID="cri-o://44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" gracePeriod=30 Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.661909 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": dial tcp 10.217.0.169:3000: connect: connection refused" Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.353318 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.353771 4755 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" containerID="cri-o://7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" gracePeriod=30 Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.353887 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" containerID="cri-o://22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" gracePeriod=30 Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.457573 4755 generic.go:334] "Generic (PLEG): container finished" podID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" exitCode=143 Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.457624 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerDied","Data":"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b"} Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.471618 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" exitCode=143 Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.471987 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerDied","Data":"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589"} Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.750860 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.751011 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.222048 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277406 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277505 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277532 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277639 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277691 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277728 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277771 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.280504 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.280632 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.282883 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts" (OuterVolumeSpecName: "scripts") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.283387 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f" (OuterVolumeSpecName: "kube-api-access-vk59f") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "kube-api-access-vk59f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.305277 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.348964 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.367701 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data" (OuterVolumeSpecName: "config-data") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380823 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380871 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380907 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380918 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380927 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380936 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380947 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.484389 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"63a07ea9507732987fa76339a9da53fdf9074739ca064b32427bb55875628827"} Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.484435 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.484468 4755 scope.go:117] "RemoveContainer" containerID="14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.489994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"96136572-ead6-4771-bd36-eec29b5fb137","Type":"ContainerStarted","Data":"8093f8dcdf9807804a717a6ff0bd78208a3f8da65f0e0452029772bc9aba27d1"} Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.501395 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-847679bbfc-l8kwj"] Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.506841 4755 scope.go:117] "RemoveContainer" containerID="6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.538910 4755 scope.go:117] "RemoveContainer" containerID="1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.549331 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstackclient" podStartSLOduration=2.455528611 podStartE2EDuration="15.549309822s" podCreationTimestamp="2026-03-20 13:51:00 +0000 UTC" firstStartedPulling="2026-03-20 13:51:01.901499892 +0000 UTC m=+1241.499432421" lastFinishedPulling="2026-03-20 13:51:14.995281103 +0000 UTC m=+1254.593213632" observedRunningTime="2026-03-20 13:51:15.525031277 +0000 UTC m=+1255.122963806" watchObservedRunningTime="2026-03-20 13:51:15.549309822 +0000 UTC m=+1255.147242361" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.562198 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.566450 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:59152->10.217.0.156:9292: read: connection reset by peer" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.566706 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:59154->10.217.0.156:9292: read: connection reset by peer" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.580842 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.592059 4755 scope.go:117] "RemoveContainer" containerID="c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594057 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594522 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594543 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594557 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594565 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594588 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594594 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594615 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594622 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594846 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594878 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594890 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594899 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.596663 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.602011 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.610338 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.610670 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687548 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687572 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687615 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.688068 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.688312 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790163 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " 
pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790348 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790379 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790403 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.791804 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.798117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.805318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.805964 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.806086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.806142 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.815454 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.929103 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.105770 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199559 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199817 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: 
\"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199950 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.200093 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.200148 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.200207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.205306 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs" (OuterVolumeSpecName: 
"logs") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.211416 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.215816 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts" (OuterVolumeSpecName: "scripts") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.218016 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw" (OuterVolumeSpecName: "kube-api-access-l4bvw") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "kube-api-access-l4bvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.228872 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.274855 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303360 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303399 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303409 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303435 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303445 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303455 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.355344 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.358837 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.378444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data" (OuterVolumeSpecName: "config-data") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.413537 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.413602 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.413620 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.520573 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:46726->10.217.0.155:9292: read: connection reset by peer" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.520993 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:46736->10.217.0.155:9292: read: connection reset by peer" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545002 4755 generic.go:334] "Generic (PLEG): container finished" podID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" exitCode=0 Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545124 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerDied","Data":"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerDied","Data":"2a8c9089e1e7047efefde5df6ddda04cf4071fad9b8adaef75c6dc167f6e724d"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545196 4755 scope.go:117] "RemoveContainer" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545327 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.570857 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847679bbfc-l8kwj" event={"ID":"12a81787-83e5-4552-85e6-19733309756d","Type":"ContainerStarted","Data":"513c4483cc521844fc779ad4ad8fea88a8f59c0d75bd000160932fcd3bb65cd0"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.570928 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847679bbfc-l8kwj" event={"ID":"12a81787-83e5-4552-85e6-19733309756d","Type":"ContainerStarted","Data":"492850182a806432985c0a76db5d357fdd1f75f64e462ac1c324547eddbc580a"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.660725 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.718632 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.723814 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-754b98cbff-jgntp" 
Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.727572 4755 scope.go:117] "RemoveContainer" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.784704 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.811285 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.811679 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.820559 4755 scope.go:117] "RemoveContainer" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.811692 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.822070 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822091 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.822559 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7\": container with ID starting with 44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7 not found: ID does not exist" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822592 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822596 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7"} err="failed to get container status \"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7\": rpc error: code = NotFound desc = could not find container \"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7\": container with ID starting with 44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7 not found: ID does not exist" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822624 4755 scope.go:117] "RemoveContainer" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822636 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.823634 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.823692 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b\": container with ID starting with 13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b not found: ID does not exist" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.823711 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b"} err="failed to get container status \"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b\": rpc error: code = NotFound desc = could not find container \"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b\": container with ID starting with 13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b not found: ID does not exist" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.827583 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.830370 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.836032 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.893752 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.894173 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cf99699dd-lg99t" 
podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" containerID="cri-o://7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1" gracePeriod=30 Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.894854 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cf99699dd-lg99t" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" containerID="cri-o://cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c" gracePeriod=30 Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956204 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956243 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-logs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956364 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956404 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956420 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj222\" (UniqueName: \"kubernetes.io/projected/e65d1645-8a19-459e-ac89-b485f27e2841-kube-api-access-lj222\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956444 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058273 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-logs\") pod 
\"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058371 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058415 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-scripts\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj222\" (UniqueName: \"kubernetes.io/projected/e65d1645-8a19-459e-ac89-b485f27e2841-kube-api-access-lj222\") pod \"glance-default-external-api-0\" (UID: 
\"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.059032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-logs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.059247 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.059838 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 
20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.067152 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.070947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.073435 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-scripts\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.091027 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj222\" (UniqueName: \"kubernetes.io/projected/e65d1645-8a19-459e-ac89-b485f27e2841-kube-api-access-lj222\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.093318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.100485 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.147285 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.257553 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" path="/var/lib/kubelet/pods/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe/volumes" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.258744 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" path="/var/lib/kubelet/pods/d489e08f-1107-45f2-b1d0-c9b786974ee4/volumes" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.536992 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589570 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589611 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589715 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589740 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589818 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589835 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.592120 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.596240 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs" (OuterVolumeSpecName: "logs") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.607144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r" (OuterVolumeSpecName: "kube-api-access-2ng4r") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "kube-api-access-2ng4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.609189 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.613541 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts" (OuterVolumeSpecName: "scripts") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.615041 4755 generic.go:334] "Generic (PLEG): container finished" podID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerID="cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c" exitCode=0 Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.615142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerDied","Data":"cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.643865 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" exitCode=0 Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.643961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerDied","Data":"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.644002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerDied","Data":"e2bc4a76cdde92278100366b804ccffd9de5944afb2aabb9539badf236736c8a"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.644023 4755 scope.go:117] "RemoveContainer" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.644187 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703066 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847679bbfc-l8kwj" event={"ID":"12a81787-83e5-4552-85e6-19733309756d","Type":"ContainerStarted","Data":"063ac1b6b9806b98d914b194e9bde9a87a0a8f7cc8c074cb5f40a75a8fe6d0e5"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703083 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703157 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703172 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703187 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703217 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.712950 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.713290 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.721825 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"e82f1532e1a1c38107bb859460f4520da9baccf34fa7c549aea69d074c192f66"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.759950 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.784798 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.786032 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-847679bbfc-l8kwj" podStartSLOduration=10.786002963 podStartE2EDuration="10.786002963s" podCreationTimestamp="2026-03-20 13:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:17.77307318 +0000 UTC m=+1257.371005709" watchObservedRunningTime="2026-03-20 13:51:17.786002963 +0000 UTC m=+1257.383935492" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.809984 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.810027 4755 reconciler_common.go:293] "Volume detached for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.814550 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.859248 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data" (OuterVolumeSpecName: "config-data") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.912787 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.912832 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.933818 4755 scope.go:117] "RemoveContainer" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.997990 4755 scope.go:117] "RemoveContainer" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.000958 4755 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3\": container with ID starting with 22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3 not found: ID does not exist" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.000988 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3"} err="failed to get container status \"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3\": rpc error: code = NotFound desc = could not find container \"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3\": container with ID starting with 22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3 not found: ID does not exist" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.001008 4755 scope.go:117] "RemoveContainer" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.001243 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589\": container with ID starting with 7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589 not found: ID does not exist" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.001260 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589"} err="failed to get container status \"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589\": rpc error: code = NotFound desc = could not find container 
\"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589\": container with ID starting with 7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589 not found: ID does not exist" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.002599 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.024911 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.039946 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.040409 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040422 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.040452 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040458 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040610 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040634 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.041490 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.047094 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.047248 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.072536 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.104261 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.120487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.120839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vbj\" (UniqueName: \"kubernetes.io/projected/6b182ae3-20c9-48af-9313-d48a608924b1-kube-api-access-w7vbj\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.120947 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc 
kubenswrapper[4755]: I0320 13:51:18.121067 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121178 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121304 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121412 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 
13:51:18 crc kubenswrapper[4755]: W0320 13:51:18.139099 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode65d1645_8a19_459e_ac89_b485f27e2841.slice/crio-c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf WatchSource:0}: Error finding container c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf: Status 404 returned error can't find the container with id c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vbj\" (UniqueName: \"kubernetes.io/projected/6b182ae3-20c9-48af-9313-d48a608924b1-kube-api-access-w7vbj\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223207 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223230 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223436 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.224288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.224790 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.225037 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.233397 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.234007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.234117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc 
kubenswrapper[4755]: I0320 13:51:18.234456 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.255372 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vbj\" (UniqueName: \"kubernetes.io/projected/6b182ae3-20c9-48af-9313-d48a608924b1-kube-api-access-w7vbj\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.273871 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.365000 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.499850 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.825798 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"} Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.841739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65d1645-8a19-459e-ac89-b485f27e2841","Type":"ContainerStarted","Data":"c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.187531 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:19 crc kubenswrapper[4755]: W0320 13:51:19.198089 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b182ae3_20c9_48af_9313_d48a608924b1.slice/crio-b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303 WatchSource:0}: Error finding container b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303: Status 404 returned error can't find the container with id b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303 Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.237448 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" path="/var/lib/kubelet/pods/74e04f8c-57a9-4c29-b9ae-5fea257f36da/volumes" Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.874628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.875670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.900141 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65d1645-8a19-459e-ac89-b485f27e2841","Type":"ContainerStarted","Data":"0687b0972ba88ee0ba6ba105a270c9eafaab3c519d6923105497f9948a2f55ec"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.927548 4755 generic.go:334] "Generic (PLEG): container finished" podID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerID="7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1" exitCode=0 Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.927634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerDied","Data":"7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.929887 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b182ae3-20c9-48af-9313-d48a608924b1","Type":"ContainerStarted","Data":"b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:19.999994 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.159286 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.159761 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.159982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.160211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.160450 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.185750 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh" (OuterVolumeSpecName: "kube-api-access-grrbh") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "kube-api-access-grrbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.186835 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.262330 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.262400 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.264563 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.277644 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config" (OuterVolumeSpecName: "config") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.317735 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.364371 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.364404 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.364414 4755 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.941390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" 
event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerDied","Data":"c78877e8c0818f15b0f2e1b6adc8bc55f8f067e988d6b39df713c8a50ee71484"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.941735 4755 scope.go:117] "RemoveContainer" containerID="cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.941473 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.945419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b182ae3-20c9-48af-9313-d48a608924b1","Type":"ContainerStarted","Data":"651a569f8e307be4d0515c4e75ec2a428af80a6c718afc1987e63aae8dae8f60"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.945469 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b182ae3-20c9-48af-9313-d48a608924b1","Type":"ContainerStarted","Data":"2fbb124138d8dd562d0f2c1fd811d9014a8cae04e612041f70cd59d89cbe840e"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.951341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65d1645-8a19-459e-ac89-b485f27e2841","Type":"ContainerStarted","Data":"16bfd23b519b858985c83d29ed9143cc5646aede3b90145aebf288ed681971db"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.966259 4755 scope.go:117] "RemoveContainer" containerID="7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.974253 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.97423423 podStartE2EDuration="3.97423423s" podCreationTimestamp="2026-03-20 13:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:20.968735948 +0000 UTC m=+1260.566668477" watchObservedRunningTime="2026-03-20 13:51:20.97423423 +0000 UTC m=+1260.572166759" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.010368 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.024766 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.030056 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.030034318 podStartE2EDuration="5.030034318s" podCreationTimestamp="2026-03-20 13:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:21.01344623 +0000 UTC m=+1260.611378759" watchObservedRunningTime="2026-03-20 13:51:21.030034318 +0000 UTC m=+1260.627966847" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.251211 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" path="/var/lib/kubelet/pods/bb70d5b8-33a3-4299-bae5-d13d998e11a2/volumes" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.417688 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.491713 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.491792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.491965 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492717 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492779 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.493179 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs" (OuterVolumeSpecName: "logs") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.493408 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.505804 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.509249 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds" (OuterVolumeSpecName: "kube-api-access-525ds") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "kube-api-access-525ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.533403 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data" (OuterVolumeSpecName: "config-data") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.538866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.559879 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts" (OuterVolumeSpecName: "scripts") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.583081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595623 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595672 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595682 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595698 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595709 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595718 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962160 4755 generic.go:334] "Generic (PLEG): container finished" podID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" exitCode=137 Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962233 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerDied","Data":"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8"} Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962307 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerDied","Data":"fd011c06f3fffccd2ebc454db1a10f42c4b31b9cc3cdee3a458a0730af40410b"} Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962339 4755 scope.go:117] "RemoveContainer" containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.007862 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.014947 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.158406 4755 scope.go:117] "RemoveContainer" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.181561 4755 scope.go:117] "RemoveContainer" 
containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.182604 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c\": container with ID starting with defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c not found: ID does not exist" containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.182666 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c"} err="failed to get container status \"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c\": rpc error: code = NotFound desc = could not find container \"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c\": container with ID starting with defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c not found: ID does not exist" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.182698 4755 scope.go:117] "RemoveContainer" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.183139 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8\": container with ID starting with 50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8 not found: ID does not exist" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.183164 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8"} err="failed to get container status \"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8\": rpc error: code = NotFound desc = could not find container \"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8\": container with ID starting with 50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8 not found: ID does not exist" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.888678 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889131 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889151 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889170 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889179 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889198 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889206 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889220 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" Mar 20 13:51:22 
crc kubenswrapper[4755]: I0320 13:51:22.889229 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889461 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889492 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889505 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889516 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.890823 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.906579 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.921675 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.921794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.984940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"} Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985146 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" containerID="cri-o://3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5" gracePeriod=30 Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985398 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985680 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" containerID="cri-o://c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b" gracePeriod=30 Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985725 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" containerID="cri-o://d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e" gracePeriod=30 Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985762 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" containerID="cri-o://028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7" gracePeriod=30 Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.001743 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.003412 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.004618 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.026463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.026573 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.027494 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.027502 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.517847701 podStartE2EDuration="8.027483802s" podCreationTimestamp="2026-03-20 13:51:15 +0000 UTC" firstStartedPulling="2026-03-20 13:51:16.727566342 +0000 UTC m=+1256.325498861" lastFinishedPulling="2026-03-20 13:51:22.237202433 +0000 UTC m=+1261.835134962" observedRunningTime="2026-03-20 13:51:23.011211652 +0000 UTC m=+1262.609144181" watchObservedRunningTime="2026-03-20 13:51:23.027483802 +0000 UTC m=+1262.625416331" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.027828 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] 
Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.033535 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.074017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.138769 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.139051 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.211462 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.241255 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.241378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.242185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.265941 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" path="/var/lib/kubelet/pods/12871c7a-ef63-447d-b1f6-27a5645dbc21/volumes" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.268103 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.270014 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.273903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.283885 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.286053 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.293454 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.315310 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.329274 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.436377 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.438204 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.442822 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.464172 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.465898 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.465997 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.466024 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.466128 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod 
\"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.469486 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573339 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573556 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573619 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " 
pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.575542 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.576822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.601513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: 
\"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.630396 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.658711 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.659098 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.659973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.673025 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.674761 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.674798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: 
\"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.675779 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.709889 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.728241 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.759139 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.789251 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.789311 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.799919 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.900706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.900772 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.902328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.931263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.932944 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 
13:51:24.018950 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.070926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9jv87" event={"ID":"0deb3f1a-0cad-4429-9e79-38e5a0b38896","Type":"ContainerStarted","Data":"80ebb90f8a0e8342a2a656ff82e63acee227c822845c1ae672984d38ad096289"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.091931 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b" exitCode=0 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.091974 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e" exitCode=2 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.091983 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7" exitCode=0 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.093265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.093301 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.093316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.266473 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.478118 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.553876 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.705641 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 13:51:24 crc kubenswrapper[4755]: W0320 13:51:24.715563 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03accbff_bdf2_4256_bdf2_1b39d5485673.slice/crio-6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622 WatchSource:0}: Error finding container 6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622: Status 404 returned error can't find the container with id 6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.846494 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 13:51:24 crc kubenswrapper[4755]: W0320 13:51:24.929783 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39991203_9b8d_4985_8e90_b3d1772f6b8f.slice/crio-108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81 WatchSource:0}: Error finding container 108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81: Status 404 returned error can't 
find the container with id 108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81 Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.114310 4755 generic.go:334] "Generic (PLEG): container finished" podID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerID="f48f17b3619a61fc0cb88d69afecc573c1b266d447ea55b0cd7bd4a5a7acc1ba" exitCode=0 Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.114484 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79jc8" event={"ID":"32a5606c-c777-4c0b-951c-6ce2e03edd7e","Type":"ContainerDied","Data":"f48f17b3619a61fc0cb88d69afecc573c1b266d447ea55b0cd7bd4a5a7acc1ba"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.114538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79jc8" event={"ID":"32a5606c-c777-4c0b-951c-6ce2e03edd7e","Type":"ContainerStarted","Data":"658bb29da666f3b8bb21f1b657a84dae77590d1b6fa2ef9da888d585f6571fb9"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.123930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerStarted","Data":"08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.123998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerStarted","Data":"6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.127585 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" event={"ID":"39991203-9b8d-4985-8e90-b3d1772f6b8f","Type":"ContainerStarted","Data":"108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81"} Mar 20 13:51:25 crc kubenswrapper[4755]: 
I0320 13:51:25.136429 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerStarted","Data":"d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.136518 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerStarted","Data":"b1827ad7ea66fc819f6e50f37833df0fe4c09c20fb51b9c4925edd1d874d847f"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.177997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerStarted","Data":"8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.178226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerStarted","Data":"a1978097078625e11f33e5697035e2510a4058e810ddcfdafcf23db8fd892acb"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.189089 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-c99c-account-create-update-5s889" podStartSLOduration=2.189052167 podStartE2EDuration="2.189052167s" podCreationTimestamp="2026-03-20 13:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:25.165109579 +0000 UTC m=+1264.763042108" watchObservedRunningTime="2026-03-20 13:51:25.189052167 +0000 UTC m=+1264.786984686" Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.194063 4755 generic.go:334] "Generic (PLEG): container finished" podID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" 
containerID="34bb019b6b2edd84278525de71c1498dee8194d1e832aa7f19aa00c20a976f27" exitCode=0 Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.194741 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9jv87" event={"ID":"0deb3f1a-0cad-4429-9e79-38e5a0b38896","Type":"ContainerDied","Data":"34bb019b6b2edd84278525de71c1498dee8194d1e832aa7f19aa00c20a976f27"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.219917 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" podStartSLOduration=2.219896822 podStartE2EDuration="2.219896822s" podCreationTimestamp="2026-03-20 13:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:25.213141197 +0000 UTC m=+1264.811073716" watchObservedRunningTime="2026-03-20 13:51:25.219896822 +0000 UTC m=+1264.817829351" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.205582 4755 generic.go:334] "Generic (PLEG): container finished" podID="03accbff-bdf2-4256-bdf2-1b39d5485673" containerID="08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6" exitCode=0 Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.206037 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerDied","Data":"08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6"} Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.209075 4755 generic.go:334] "Generic (PLEG): container finished" podID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerID="a9a2e83547c76638fc8671a99e0bfb3517ad85689f2490760b78b38ac376cdd5" exitCode=0 Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.209190 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" 
event={"ID":"39991203-9b8d-4985-8e90-b3d1772f6b8f","Type":"ContainerDied","Data":"a9a2e83547c76638fc8671a99e0bfb3517ad85689f2490760b78b38ac376cdd5"} Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.211048 4755 generic.go:334] "Generic (PLEG): container finished" podID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerID="d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7" exitCode=0 Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.211087 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerDied","Data":"d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7"} Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.212606 4755 generic.go:334] "Generic (PLEG): container finished" podID="f395acec-f28b-4622-b349-127cf31ec92d" containerID="8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363" exitCode=0 Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.212790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerDied","Data":"8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363"} Mar 20 13:51:26 crc kubenswrapper[4755]: E0320 13:51:26.342687 4755 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.588340 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.681796 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"f395acec-f28b-4622-b349-127cf31ec92d\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.681857 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"f395acec-f28b-4622-b349-127cf31ec92d\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.682742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f395acec-f28b-4622-b349-127cf31ec92d" (UID: "f395acec-f28b-4622-b349-127cf31ec92d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.688898 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k" (OuterVolumeSpecName: "kube-api-access-rlf4k") pod "f395acec-f28b-4622-b349-127cf31ec92d" (UID: "f395acec-f28b-4622-b349-127cf31ec92d"). InnerVolumeSpecName "kube-api-access-rlf4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.784391 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.784791 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.888438 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.900964 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.987556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.987691 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.989456 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"0deb3f1a-0cad-4429-9e79-38e5a0b38896" (UID: "0deb3f1a-0cad-4429-9e79-38e5a0b38896"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.003846 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b" (OuterVolumeSpecName: "kube-api-access-7dw7b") pod "0deb3f1a-0cad-4429-9e79-38e5a0b38896" (UID: "0deb3f1a-0cad-4429-9e79-38e5a0b38896"). InnerVolumeSpecName "kube-api-access-7dw7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.089251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.089408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.089793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32a5606c-c777-4c0b-951c-6ce2e03edd7e" (UID: "32a5606c-c777-4c0b-951c-6ce2e03edd7e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.090128 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.090148 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.090161 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.093844 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4" (OuterVolumeSpecName: "kube-api-access-nwfb4") pod "32a5606c-c777-4c0b-951c-6ce2e03edd7e" (UID: "32a5606c-c777-4c0b-951c-6ce2e03edd7e"). InnerVolumeSpecName "kube-api-access-nwfb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.150017 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.150272 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.191876 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.205687 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.210854 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.254981 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.255001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9jv87" event={"ID":"0deb3f1a-0cad-4429-9e79-38e5a0b38896","Type":"ContainerDied","Data":"80ebb90f8a0e8342a2a656ff82e63acee227c822845c1ae672984d38ad096289"} Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.255080 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ebb90f8a0e8342a2a656ff82e63acee227c822845c1ae672984d38ad096289" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.265050 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.265110 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerDied","Data":"a1978097078625e11f33e5697035e2510a4058e810ddcfdafcf23db8fd892acb"} Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.265152 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1978097078625e11f33e5697035e2510a4058e810ddcfdafcf23db8fd892acb" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.270287 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79jc8" event={"ID":"32a5606c-c777-4c0b-951c-6ce2e03edd7e","Type":"ContainerDied","Data":"658bb29da666f3b8bb21f1b657a84dae77590d1b6fa2ef9da888d585f6571fb9"} Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.270320 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658bb29da666f3b8bb21f1b657a84dae77590d1b6fa2ef9da888d585f6571fb9" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.270417 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.272511 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.272555 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.716869 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.801648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"39991203-9b8d-4985-8e90-b3d1772f6b8f\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.801839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"39991203-9b8d-4985-8e90-b3d1772f6b8f\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.802194 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39991203-9b8d-4985-8e90-b3d1772f6b8f" (UID: "39991203-9b8d-4985-8e90-b3d1772f6b8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.802342 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.806882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7" (OuterVolumeSpecName: "kube-api-access-gzct7") pod "39991203-9b8d-4985-8e90-b3d1772f6b8f" (UID: "39991203-9b8d-4985-8e90-b3d1772f6b8f"). InnerVolumeSpecName "kube-api-access-gzct7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.847374 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.861241 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.903825 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008137 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008231 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"03accbff-bdf2-4256-bdf2-1b39d5485673\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008341 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod \"03accbff-bdf2-4256-bdf2-1b39d5485673\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008435 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008561 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" (UID: "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008761 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03accbff-bdf2-4256-bdf2-1b39d5485673" (UID: "03accbff-bdf2-4256-bdf2-1b39d5485673"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008831 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.012396 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6" (OuterVolumeSpecName: "kube-api-access-mhdb6") pod "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" (UID: "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86"). InnerVolumeSpecName "kube-api-access-mhdb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.013670 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n" (OuterVolumeSpecName: "kube-api-access-bqb6n") pod "03accbff-bdf2-4256-bdf2-1b39d5485673" (UID: "03accbff-bdf2-4256-bdf2-1b39d5485673"). InnerVolumeSpecName "kube-api-access-bqb6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.110301 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.110342 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.110352 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.282052 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerDied","Data":"6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622"} Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.282103 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.282103 4755 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.283626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" event={"ID":"39991203-9b8d-4985-8e90-b3d1772f6b8f","Type":"ContainerDied","Data":"108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81"} Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.283667 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.283672 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.285496 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.285480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerDied","Data":"b1827ad7ea66fc819f6e50f37833df0fe4c09c20fb51b9c4925edd1d874d847f"} Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.285621 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1827ad7ea66fc819f6e50f37833df0fe4c09c20fb51b9c4925edd1d874d847f" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.365877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.365924 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:28 crc 
kubenswrapper[4755]: I0320 13:51:28.406444 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.418686 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.226195 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301483 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5" exitCode=0 Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301575 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301583 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301626 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.302506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"} Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.302535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"e82f1532e1a1c38107bb859460f4520da9baccf34fa7c549aea69d074c192f66"} Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.302555 4755 scope.go:117] "RemoveContainer" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.304001 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.304526 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.334129 4755 scope.go:117] "RemoveContainer" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.363685 4755 scope.go:117] "RemoveContainer" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.394849 4755 scope.go:117] "RemoveContainer" containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.420048 4755 scope.go:117] "RemoveContainer" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.421091 4755 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b\": container with ID starting with c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b not found: ID does not exist" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421140 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"} err="failed to get container status \"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b\": rpc error: code = NotFound desc = could not find container \"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b\": container with ID starting with c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b not found: ID does not exist" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421176 4755 scope.go:117] "RemoveContainer" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.421815 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e\": container with ID starting with d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e not found: ID does not exist" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421882 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"} err="failed to get container status \"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e\": rpc error: code = NotFound desc = could not find container 
\"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e\": container with ID starting with d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e not found: ID does not exist" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421918 4755 scope.go:117] "RemoveContainer" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.422231 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7\": container with ID starting with 028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7 not found: ID does not exist" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.422255 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"} err="failed to get container status \"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7\": rpc error: code = NotFound desc = could not find container \"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7\": container with ID starting with 028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7 not found: ID does not exist" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.422269 4755 scope.go:117] "RemoveContainer" containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.422634 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5\": container with ID starting with 3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5 not found: ID does not exist" 
containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.422709 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"} err="failed to get container status \"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5\": rpc error: code = NotFound desc = could not find container \"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5\": container with ID starting with 3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5 not found: ID does not exist" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.436888 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437008 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437038 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437080 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod 
\"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437173 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437380 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437650 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437932 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.438217 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.445461 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts" (OuterVolumeSpecName: "scripts") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.455761 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.458242 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9" (OuterVolumeSpecName: "kube-api-access-ctmh9") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "kube-api-access-ctmh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.469499 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.476562 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539299 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539329 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539338 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539349 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.550537 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.611619 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data" (OuterVolumeSpecName: "config-data") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.640831 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.640870 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.934853 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.949112 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.978031 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.978448 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.978464 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673" 
containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983851 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983871 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983889 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983897 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983910 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983918 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983936 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983944 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983958 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983965 4755 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983984 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983993 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.984005 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f395acec-f28b-4622-b349-127cf31ec92d" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984012 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f395acec-f28b-4622-b349-127cf31ec92d" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.984028 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984036 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.984053 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984060 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984429 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f395acec-f28b-4622-b349-127cf31ec92d" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: 
I0320 13:51:29.984450 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984464 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984477 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984492 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984501 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984515 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984529 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984539 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984554 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.986508 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.986640 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.989243 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.989762 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150746 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150791 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " 
pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.151005 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.151028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253597 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253776 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.254299 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 
crc kubenswrapper[4755]: I0320 13:51:30.255265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.258553 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.258679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.259100 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.271846 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.275907 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"ceilometer-0\" (UID: 
\"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.306798 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: W0320 13:51:30.967458 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69454aac_1cd3_4905_84a8_9798dce108a6.slice/crio-bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf WatchSource:0}: Error finding container bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf: Status 404 returned error can't find the container with id bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.982145 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.237929 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" path="/var/lib/kubelet/pods/93514dc3-0a66-4347-9dba-f787f875cd5c/volumes" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.358024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf"} Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.358106 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.358119 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.403476 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 
13:51:31.405898 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:32 crc kubenswrapper[4755]: I0320 13:51:32.366953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"} Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.377700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"} Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.917065 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.918733 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.921484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xvvcw" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.921528 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.921846 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.929793 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038079 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " 
pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038716 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139750 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139875 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: 
\"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.146748 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.149070 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.152588 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.167548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.360310 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.390711 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"} Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.909213 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 13:51:34 crc kubenswrapper[4755]: W0320 13:51:34.914105 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaef786e_b221_4fff_8d48_42b8163ed86a.slice/crio-468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467 WatchSource:0}: Error finding container 468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467: Status 404 returned error can't find the container with id 468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467 Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.940552 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:35 crc kubenswrapper[4755]: I0320 13:51:35.400892 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerStarted","Data":"468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467"} Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.434433 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"} Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435186 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent" containerID="cri-o://bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4" gracePeriod=30 Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435548 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435862 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd" containerID="cri-o://89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378" gracePeriod=30 Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435906 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core" containerID="cri-o://c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d" gracePeriod=30 Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435935 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-notification-agent" containerID="cri-o://65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb" gracePeriod=30 Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.471247 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.2766984089999998 podStartE2EDuration="8.471224959s" podCreationTimestamp="2026-03-20 13:51:29 +0000 UTC" firstStartedPulling="2026-03-20 13:51:30.970731708 +0000 UTC m=+1270.568664237" lastFinishedPulling="2026-03-20 13:51:36.165258258 +0000 UTC m=+1275.763190787" observedRunningTime="2026-03-20 13:51:37.462120725 +0000 UTC m=+1277.060053254" watchObservedRunningTime="2026-03-20 
13:51:37.471224959 +0000 UTC m=+1277.069157488" Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.446934 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378" exitCode=0 Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447253 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d" exitCode=2 Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447265 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb" exitCode=0 Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447285 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"} Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447314 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"} Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447323 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"} Mar 20 13:51:44 crc kubenswrapper[4755]: I0320 13:51:44.505708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" 
event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerStarted","Data":"109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28"} Mar 20 13:51:44 crc kubenswrapper[4755]: I0320 13:51:44.531764 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" podStartSLOduration=2.555260545 podStartE2EDuration="11.531739763s" podCreationTimestamp="2026-03-20 13:51:33 +0000 UTC" firstStartedPulling="2026-03-20 13:51:34.916796229 +0000 UTC m=+1274.514728758" lastFinishedPulling="2026-03-20 13:51:43.893275457 +0000 UTC m=+1283.491207976" observedRunningTime="2026-03-20 13:51:44.527621198 +0000 UTC m=+1284.125553737" watchObservedRunningTime="2026-03-20 13:51:44.531739763 +0000 UTC m=+1284.129672312" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.165337 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275249 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275352 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275425 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " 
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275468 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275536 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275596 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275642 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.276413 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.276626 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.282950 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k" (OuterVolumeSpecName: "kube-api-access-62f6k") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "kube-api-access-62f6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.283745 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts" (OuterVolumeSpecName: "scripts") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.326298 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379084 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379128 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379140 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379151 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379433 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.414873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data" (OuterVolumeSpecName: "config-data") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.481201 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.481245 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.518589 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4" exitCode=0 Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.518749 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.519744 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"} Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.519877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf"} Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.519949 4755 scope.go:117] "RemoveContainer" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.551012 4755 scope.go:117] "RemoveContainer" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d" Mar 20 13:51:45 crc 
kubenswrapper[4755]: I0320 13:51:45.560622 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.575932 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.584752 4755 scope.go:117] "RemoveContainer" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.590847 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599077 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599109 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent" Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599121 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599127 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd" Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599143 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599149 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core" Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599160 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" 
containerName="ceilometer-notification-agent" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599166 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-notification-agent" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599338 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-notification-agent" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599346 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599353 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599364 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.600939 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.617941 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.618341 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.618478 4755 scope.go:117] "RemoveContainer" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.639850 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.656390 4755 scope.go:117] "RemoveContainer" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378" Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.656968 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378\": container with ID starting with 89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378 not found: ID does not exist" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.656996 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"} err="failed to get container status \"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378\": rpc error: code = NotFound desc = could not find container \"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378\": container with ID starting with 89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378 not found: ID does not exist" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 
13:51:45.657017 4755 scope.go:117] "RemoveContainer" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d" Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.657616 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d\": container with ID starting with c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d not found: ID does not exist" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657633 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"} err="failed to get container status \"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d\": rpc error: code = NotFound desc = could not find container \"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d\": container with ID starting with c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d not found: ID does not exist" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657645 4755 scope.go:117] "RemoveContainer" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb" Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.657815 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb\": container with ID starting with 65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb not found: ID does not exist" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657828 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"} err="failed to get container status \"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb\": rpc error: code = NotFound desc = could not find container \"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb\": container with ID starting with 65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb not found: ID does not exist" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657839 4755 scope.go:117] "RemoveContainer" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4" Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.660920 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4\": container with ID starting with bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4 not found: ID does not exist" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.660957 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"} err="failed to get container status \"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4\": rpc error: code = NotFound desc = could not find container \"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4\": container with ID starting with bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4 not found: ID does not exist" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.688877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod \"ceilometer-0\" (UID: 
\"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689319 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689711 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: 
I0320 13:51:45.689847 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791605 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791672 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791708 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod 
\"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791840 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.792214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.792282 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.795328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.795720 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.796007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.796596 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.815488 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0" Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.936691 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:46 crc kubenswrapper[4755]: I0320 13:51:46.402044 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:46 crc kubenswrapper[4755]: I0320 13:51:46.529664 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"c25884d58a94c709373c3a5fa0d84cf0c5b160aab2b5f055314dff5c1917244c"} Mar 20 13:51:47 crc kubenswrapper[4755]: I0320 13:51:47.236811 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" path="/var/lib/kubelet/pods/69454aac-1cd3-4905-84a8-9798dce108a6/volumes" Mar 20 13:51:47 crc kubenswrapper[4755]: I0320 13:51:47.376477 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:47 crc kubenswrapper[4755]: I0320 13:51:47.543256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} Mar 20 13:51:48 crc kubenswrapper[4755]: I0320 13:51:48.553026 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} Mar 20 13:51:49 crc kubenswrapper[4755]: I0320 13:51:49.563909 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.601199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.601949 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" containerID="cri-o://7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" gracePeriod=30 Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.602020 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" containerID="cri-o://3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" gracePeriod=30 Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.602198 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" containerID="cri-o://d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" gracePeriod=30 Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.602379 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.601960 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" containerID="cri-o://3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" gracePeriod=30 Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.654864 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.589498097 podStartE2EDuration="7.654844766s" podCreationTimestamp="2026-03-20 13:51:45 +0000 UTC" 
firstStartedPulling="2026-03-20 13:51:46.403155589 +0000 UTC m=+1286.001088138" lastFinishedPulling="2026-03-20 13:51:51.468502268 +0000 UTC m=+1291.066434807" observedRunningTime="2026-03-20 13:51:52.641347158 +0000 UTC m=+1292.239279707" watchObservedRunningTime="2026-03-20 13:51:52.654844766 +0000 UTC m=+1292.252777285" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.347679 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.449880 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.449932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450044 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450082 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450120 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450210 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450281 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450813 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.456412 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt" (OuterVolumeSpecName: "kube-api-access-ssgtt") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "kube-api-access-ssgtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.456490 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts" (OuterVolumeSpecName: "scripts") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.487083 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.523218 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.547118 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data" (OuterVolumeSpecName: "config-data") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552479 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552511 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552528 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552539 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552552 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552562 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552575 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616851 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" exitCode=0 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616897 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" exitCode=2 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616907 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" exitCode=0 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616914 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" exitCode=0 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616906 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616958 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.617002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"c25884d58a94c709373c3a5fa0d84cf0c5b160aab2b5f055314dff5c1917244c"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.617006 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.617024 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.637451 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.681699 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.690936 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.706868 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.717351 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.725787 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.726493 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.726598 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.726734 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.726808 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.726907 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727008 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.727099 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727164 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727440 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727544 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727615 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727718 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.731026 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.738613 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.739553 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.739553 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.809748 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.810361 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.810428 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.810471 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 
13:51:53.811028 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811063 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811086 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.811438 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811475 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc 
error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811493 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.811804 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811839 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811866 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812221 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status 
\"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812244 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812575 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812596 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812842 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812860 4755 scope.go:117] "RemoveContainer" 
containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813053 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813097 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813317 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813335 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813566 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could 
not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813591 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813850 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813874 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814311 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814333 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 
13:51:53.814563 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814582 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814792 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814814 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.815006 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with 
d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.815024 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.815265 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857754 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc 
kubenswrapper[4755]: I0320 13:51:53.857814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857864 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959818 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959843 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc 
kubenswrapper[4755]: I0320 13:51:53.960387 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.960626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.965645 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.966572 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.967173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.967361 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"ceilometer-0\" (UID: 
\"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.980801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.112553 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:54 crc kubenswrapper[4755]: W0320 13:51:54.605566 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886eb096_8aa3_423b_b611_03cc592de1d0.slice/crio-05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29 WatchSource:0}: Error finding container 05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29: Status 404 returned error can't find the container with id 05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29 Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.607139 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.631222 4755 generic.go:334] "Generic (PLEG): container finished" podID="faef786e-b221-4fff-8d48-42b8163ed86a" containerID="109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28" exitCode=0 Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.631293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerDied","Data":"109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28"} Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.634385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29"} Mar 20 13:51:55 crc kubenswrapper[4755]: I0320 13:51:55.243418 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" path="/var/lib/kubelet/pods/1221c1db-7d43-4307-928b-1360577fcbe7/volumes" Mar 20 13:51:55 crc kubenswrapper[4755]: I0320 13:51:55.645911 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5"} Mar 20 13:51:55 crc kubenswrapper[4755]: I0320 13:51:55.994478 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.108067 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.108204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.108266 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " Mar 20 13:51:56 crc 
kubenswrapper[4755]: I0320 13:51:56.108318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.113863 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts" (OuterVolumeSpecName: "scripts") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.116113 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj" (OuterVolumeSpecName: "kube-api-access-wf2pj") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "kube-api-access-wf2pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.139880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.178917 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data" (OuterVolumeSpecName: "config-data") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211757 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211815 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211831 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211848 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.658568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerDied","Data":"468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467"} Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.660118 4755 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.660171 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.665447 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0"} Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.745944 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:51:56 crc kubenswrapper[4755]: E0320 13:51:56.746637 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" containerName="nova-cell0-conductor-db-sync" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.746780 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" containerName="nova-cell0-conductor-db-sync" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.747149 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" containerName="nova-cell0-conductor-db-sync" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.747948 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.750418 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.752359 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xvvcw" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.760951 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.825407 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.825459 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqr2c\" (UniqueName: \"kubernetes.io/projected/676b01c6-a64d-4530-b157-10160afd719a-kube-api-access-jqr2c\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.825515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.927601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.927644 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqr2c\" (UniqueName: \"kubernetes.io/projected/676b01c6-a64d-4530-b157-10160afd719a-kube-api-access-jqr2c\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.927707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.932868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.933193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.947556 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqr2c\" (UniqueName: \"kubernetes.io/projected/676b01c6-a64d-4530-b157-10160afd719a-kube-api-access-jqr2c\") pod \"nova-cell0-conductor-0\" (UID: 
\"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.066923 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.539293 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:51:57 crc kubenswrapper[4755]: W0320 13:51:57.540382 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676b01c6_a64d_4530_b157_10160afd719a.slice/crio-622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66 WatchSource:0}: Error finding container 622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66: Status 404 returned error can't find the container with id 622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66 Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.680154 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44"} Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.681789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"676b01c6-a64d-4530-b157-10160afd719a","Type":"ContainerStarted","Data":"622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66"} Mar 20 13:51:58 crc kubenswrapper[4755]: I0320 13:51:58.706371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"676b01c6-a64d-4530-b157-10160afd719a","Type":"ContainerStarted","Data":"6a245ce1a50d214ae90c1d6f845a8ed04a969d3c3195667e4091a43718f806b1"} Mar 20 13:51:58 crc kubenswrapper[4755]: I0320 13:51:58.708937 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell0-conductor-0" Mar 20 13:51:58 crc kubenswrapper[4755]: I0320 13:51:58.736292 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.736273026 podStartE2EDuration="2.736273026s" podCreationTimestamp="2026-03-20 13:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:58.72372272 +0000 UTC m=+1298.321655289" watchObservedRunningTime="2026-03-20 13:51:58.736273026 +0000 UTC m=+1298.334205555" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.133438 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"] Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.135026 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.139615 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.139937 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.140042 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.144710 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"] Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.203998 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod 
\"auto-csr-approver-29566912-cl4dk\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.306013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod \"auto-csr-approver-29566912-cl4dk\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.326839 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod \"auto-csr-approver-29566912-cl4dk\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.453951 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.726575 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398"} Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.727168 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.771340 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.957171275 podStartE2EDuration="7.77131784s" podCreationTimestamp="2026-03-20 13:51:53 +0000 UTC" firstStartedPulling="2026-03-20 13:51:54.610406502 +0000 UTC m=+1294.208339071" lastFinishedPulling="2026-03-20 13:51:59.424553087 +0000 UTC m=+1299.022485636" observedRunningTime="2026-03-20 13:52:00.768098917 +0000 UTC m=+1300.366031446" watchObservedRunningTime="2026-03-20 13:52:00.77131784 +0000 UTC m=+1300.369250369" Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.960742 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"] Mar 20 13:52:01 crc kubenswrapper[4755]: I0320 13:52:01.737495 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" event={"ID":"78cf774b-eb80-4f5b-a7de-2012636d36c5","Type":"ContainerStarted","Data":"b07077c65d7a8ce69c944552b9a779349ccdc8a2914370a0af3c251b613f1c6f"} Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.117297 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.577494 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 
13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.578583 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.582337 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.591985 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.607171 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.665882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.666226 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.666272 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.666305 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.757207 4755 generic.go:334] "Generic (PLEG): container finished" podID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerID="72b2f009d2a4423710b2308fccd453e64decc2036c9ffeba13690d2169eaf608" exitCode=0 Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.757274 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" event={"ID":"78cf774b-eb80-4f5b-a7de-2012636d36c5","Type":"ContainerDied","Data":"72b2f009d2a4423710b2308fccd453e64decc2036c9ffeba13690d2169eaf608"} Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.769996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.770070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.770112 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: 
\"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.770145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.785336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.787365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.798294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.803574 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 
13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.837819 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.842779 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.857266 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.860436 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.902869 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.952519 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.954170 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.955507 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.958725 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.981813 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.991017 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996854 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.036863 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.107595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108031 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108056 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108118 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108143 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108175 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108201 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108229 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.122511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.137039 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.138682 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.139930 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.143028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.144755 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.161554 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.206100 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211061 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211143 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211177 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211227 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211254 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211286 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211308 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211400 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211460 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vf8v\" (UniqueName: 
\"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.216489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.230095 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.246365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.253131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.271722 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc 
kubenswrapper[4755]: I0320 13:52:03.273594 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.274560 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.298126 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.301348 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.301470 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314159 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314927 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.319338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.323290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.326950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.342474 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.381632 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.411104 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.425914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.425963 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426141 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426169 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545309 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545361 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545397 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: 
\"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545602 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545619 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.547173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.548105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.549113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 
20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.550007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.550812 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.598673 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.607638 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.633544 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: W0320 13:52:03.699048 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff73477_b65b_4362_938c_94b1bb1f51b0.slice/crio-d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73 WatchSource:0}: Error finding container d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73: Status 404 returned error can't find the container with id d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73 Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.702957 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.808057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerStarted","Data":"d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73"} Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.910350 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.914331 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.917503 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.921121 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.924630 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.930955 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070872 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " 
pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.075190 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172728 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.186974 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.187972 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.189215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.196215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.210351 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 
13:52:04.320783 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.326488 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.411775 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.417446 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:04 crc kubenswrapper[4755]: W0320 13:52:04.453737 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24dab838_4670_45f3_8276_240f4266194d.slice/crio-a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9 WatchSource:0}: Error finding container a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9: Status 404 returned error can't find the container with id a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9 Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.587517 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod \"78cf774b-eb80-4f5b-a7de-2012636d36c5\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.594807 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz" (OuterVolumeSpecName: "kube-api-access-jx4bz") pod "78cf774b-eb80-4f5b-a7de-2012636d36c5" (UID: "78cf774b-eb80-4f5b-a7de-2012636d36c5"). InnerVolumeSpecName "kube-api-access-jx4bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.690171 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.879023 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerStarted","Data":"4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.891675 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerStarted","Data":"07fce25f45c6fe707885052798e47cbce52b19aa7717044f52d0bb81703a15ba"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.912631 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vz8fw" podStartSLOduration=2.912597094 podStartE2EDuration="2.912597094s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:04.899047113 +0000 UTC m=+1304.496979642" watchObservedRunningTime="2026-03-20 13:52:04.912597094 +0000 UTC m=+1304.510529623" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.912942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerStarted","Data":"1a97c2ff93adc00e2962e3ca60c754b0172a77338886546dec9012549fe43753"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.920834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerStarted","Data":"ec205dafe86043e70d781baf18f8b1170a7b1ab37a63103271c795f73d1873fe"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.930185 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.930196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" event={"ID":"78cf774b-eb80-4f5b-a7de-2012636d36c5","Type":"ContainerDied","Data":"b07077c65d7a8ce69c944552b9a779349ccdc8a2914370a0af3c251b613f1c6f"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.930243 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b07077c65d7a8ce69c944552b9a779349ccdc8a2914370a0af3c251b613f1c6f" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.931861 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.936988 4755 generic.go:334] "Generic (PLEG): container finished" podID="24dab838-4670-45f3-8276-240f4266194d" containerID="1acd81a61329d7b9a26f38cb792eb39488a3ccf7a0bcc5d4334568c772df3f16" exitCode=0 Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.937175 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerDied","Data":"1acd81a61329d7b9a26f38cb792eb39488a3ccf7a0bcc5d4334568c772df3f16"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.937258 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerStarted","Data":"a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9"} Mar 20 13:52:04 crc kubenswrapper[4755]: W0320 
13:52:04.939227 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcadbdc7c_ed66_43d7_82ee_d797beb959a8.slice/crio-01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636 WatchSource:0}: Error finding container 01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636: Status 404 returned error can't find the container with id 01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636 Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.955894 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerStarted","Data":"d305ab31e9622ed372defac60b08de6826298aff9ba6d4afc85a9d25d074e86a"} Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.500209 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"] Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.522208 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"] Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.981041 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerStarted","Data":"8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718"} Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.981292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerStarted","Data":"01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636"} Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.990748 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" 
event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerStarted","Data":"62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d"} Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.010515 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" podStartSLOduration=3.010486787 podStartE2EDuration="3.010486787s" podCreationTimestamp="2026-03-20 13:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:06.003302282 +0000 UTC m=+1305.601234831" watchObservedRunningTime="2026-03-20 13:52:06.010486787 +0000 UTC m=+1305.608419326" Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.035108 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-726q6" podStartSLOduration=3.035085355 podStartE2EDuration="3.035085355s" podCreationTimestamp="2026-03-20 13:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:06.030109896 +0000 UTC m=+1305.628042455" watchObservedRunningTime="2026-03-20 13:52:06.035085355 +0000 UTC m=+1305.633017884" Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.651277 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.661313 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.751927 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:52:06 crc 
kubenswrapper[4755]: I0320 13:52:06.752018 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.999361 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:07 crc kubenswrapper[4755]: I0320 13:52:07.256057 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" path="/var/lib/kubelet/pods/47f9ab28-1218-4dcf-a989-728b9063a3e9/volumes" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.007526 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerStarted","Data":"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.008005 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerStarted","Data":"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.009898 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerStarted","Data":"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.018213 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerStarted","Data":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} 
Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.021020 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0" gracePeriod=30 Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.021325 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerStarted","Data":"56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.043363 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.783444717 podStartE2EDuration="6.043333235s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="2026-03-20 13:52:04.090688149 +0000 UTC m=+1303.688620678" lastFinishedPulling="2026-03-20 13:52:07.350576667 +0000 UTC m=+1306.948509196" observedRunningTime="2026-03-20 13:52:08.032375361 +0000 UTC m=+1307.630307890" watchObservedRunningTime="2026-03-20 13:52:08.043333235 +0000 UTC m=+1307.641265764" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.129336 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.968323458 podStartE2EDuration="6.129310843s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="2026-03-20 13:52:04.188561165 +0000 UTC m=+1303.786493694" lastFinishedPulling="2026-03-20 13:52:07.34954855 +0000 UTC m=+1306.947481079" observedRunningTime="2026-03-20 13:52:08.095105447 +0000 UTC m=+1307.693037976" watchObservedRunningTime="2026-03-20 13:52:08.129310843 +0000 UTC m=+1307.727243372" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.131140 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.71486039 podStartE2EDuration="6.131133559s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="2026-03-20 13:52:03.929332109 +0000 UTC m=+1303.527264638" lastFinishedPulling="2026-03-20 13:52:07.345605278 +0000 UTC m=+1306.943537807" observedRunningTime="2026-03-20 13:52:08.125154205 +0000 UTC m=+1307.723086734" watchObservedRunningTime="2026-03-20 13:52:08.131133559 +0000 UTC m=+1307.729066088" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.207146 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.411788 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.034589 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerStarted","Data":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.035014 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-log" containerID="cri-o://f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" gracePeriod=30 Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.035022 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" containerID="cri-o://9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" gracePeriod=30 Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.076709 4755 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.058379867 podStartE2EDuration="6.076646666s" podCreationTimestamp="2026-03-20 13:52:03 +0000 UTC" firstStartedPulling="2026-03-20 13:52:04.332221216 +0000 UTC m=+1303.930153745" lastFinishedPulling="2026-03-20 13:52:07.350488015 +0000 UTC m=+1306.948420544" observedRunningTime="2026-03-20 13:52:09.065928188 +0000 UTC m=+1308.663860717" watchObservedRunningTime="2026-03-20 13:52:09.076646666 +0000 UTC m=+1308.674579195" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.723160 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832427 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832533 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832644 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") 
pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.833639 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs" (OuterVolumeSpecName: "logs") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.854053 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v" (OuterVolumeSpecName: "kube-api-access-5vf8v") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "kube-api-access-5vf8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.882067 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.883394 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data" (OuterVolumeSpecName: "config-data") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936002 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936036 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936047 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936057 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069371 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4601b20-9dc6-41dd-ab44-f10600003906" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" exitCode=0 Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069471 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4601b20-9dc6-41dd-ab44-f10600003906" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" exitCode=143 Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069629 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerDied","Data":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerDied","Data":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069816 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerDied","Data":"1a97c2ff93adc00e2962e3ca60c754b0172a77338886546dec9012549fe43753"} Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069840 4755 scope.go:117] "RemoveContainer" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.094885 4755 scope.go:117] "RemoveContainer" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.123911 4755 scope.go:117] "RemoveContainer" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.124738 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": container with ID starting with 9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66 not found: ID does not exist" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc kubenswrapper[4755]: 
I0320 13:52:10.124865 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} err="failed to get container status \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": rpc error: code = NotFound desc = could not find container \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": container with ID starting with 9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.124921 4755 scope.go:117] "RemoveContainer" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.125362 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": container with ID starting with f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28 not found: ID does not exist" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.125413 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} err="failed to get container status \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": rpc error: code = NotFound desc = could not find container \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": container with ID starting with f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.125446 4755 scope.go:117] "RemoveContainer" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc 
kubenswrapper[4755]: I0320 13:52:10.126190 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} err="failed to get container status \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": rpc error: code = NotFound desc = could not find container \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": container with ID starting with 9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.126255 4755 scope.go:117] "RemoveContainer" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.126642 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} err="failed to get container status \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": rpc error: code = NotFound desc = could not find container \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": container with ID starting with f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.142232 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.148943 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.164561 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.164982 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" 
containerName="nova-metadata-log" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.164998 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-log" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.165018 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165025 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.165033 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerName="oc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165039 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerName="oc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165213 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-log" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165223 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerName="oc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165237 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.166212 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.179307 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.179512 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.179793 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.246567 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.246765 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.247062 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.247157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod 
\"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.247726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.331030 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4601b20_9dc6_41dd_ab44_f10600003906.slice/crio-1a97c2ff93adc00e2962e3ca60c754b0172a77338886546dec9012549fe43753\": RecentStats: unable to find data in memory cache]" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.349506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.349561 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.349642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 
crc kubenswrapper[4755]: I0320 13:52:10.349787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.350236 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.350420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.354328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.354521 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.369018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.380223 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.509090 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:11 crc kubenswrapper[4755]: I0320 13:52:11.051328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:11 crc kubenswrapper[4755]: I0320 13:52:11.090542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerStarted","Data":"7a21afc95b0624df6ce1d9bc13f6bf0f3fd81506690ec7d242284e3e4ee61373"} Mar 20 13:52:11 crc kubenswrapper[4755]: I0320 13:52:11.239788 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" path="/var/lib/kubelet/pods/d4601b20-9dc6-41dd-ab44-f10600003906/volumes" Mar 20 13:52:12 crc kubenswrapper[4755]: I0320 13:52:12.103153 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerStarted","Data":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"} Mar 20 13:52:12 crc kubenswrapper[4755]: I0320 13:52:12.103495 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerStarted","Data":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"} Mar 20 13:52:12 crc kubenswrapper[4755]: I0320 13:52:12.132499 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.132472547 podStartE2EDuration="2.132472547s" podCreationTimestamp="2026-03-20 13:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:12.124416988 +0000 UTC m=+1311.722349537" watchObservedRunningTime="2026-03-20 13:52:12.132472547 +0000 UTC m=+1311.730405096" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.115939 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerID="4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a" exitCode=0 Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.116048 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerDied","Data":"4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a"} Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.206942 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.246303 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.382588 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.382634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 
13:52:13.635698 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.698746 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.698979 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-656mk" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns" containerID="cri-o://fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437" gracePeriod=10 Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.130407 4755 generic.go:334] "Generic (PLEG): container finished" podID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerID="8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718" exitCode=0 Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.130576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerDied","Data":"8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718"} Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.137121 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerID="fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437" exitCode=0 Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.138031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerDied","Data":"fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437"} Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.177164 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:52:14 crc 
kubenswrapper[4755]: I0320 13:52:14.288906 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.340610 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341088 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341336 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341464 
4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.358129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6" (OuterVolumeSpecName: "kube-api-access-czrk6") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "kube-api-access-czrk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.444306 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.444543 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.446186 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.450441 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.465855 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.465887 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.488399 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config" (OuterVolumeSpecName: "config") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.493617 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.497694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545339 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") "
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545395 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") "
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545447 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") "
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545600 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") "
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546030 4755 reconciler_common.go:293]
"Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546056 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546071 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546084 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546096 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.548992 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts" (OuterVolumeSpecName: "scripts") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.551370 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc" (OuterVolumeSpecName: "kube-api-access-jz6sc") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "kube-api-access-jz6sc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.576213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.584971 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data" (OuterVolumeSpecName: "config-data") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648269 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648305 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648317 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648326 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.157362 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerDied","Data":"0ac451aa4ed8d677d77946a6a4c4490aa16c5aad1720a8d22a9ecfc0acddbe6e"}
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.157427 4755 scope.go:117] "RemoveContainer" containerID="fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437"
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.157606 4755 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk"
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.176826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerDied","Data":"d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73"}
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.177346 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73"
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.176960 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.214482 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"]
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.226030 4755 scope.go:117] "RemoveContainer" containerID="74077c1fbfd38abf1631b5d13dad67c803ceb01aef73d54463a17c47b433408b"
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.235747 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"]
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.357018 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.357237 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" containerID="cri-o://51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831" gracePeriod=30
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.357786 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06"
containerName="nova-api-api" containerID="cri-o://a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5" gracePeriod=30
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.398204 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.445022 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.445466 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log" containerID="cri-o://786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158" gracePeriod=30
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.446138 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata" containerID="cri-o://b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb" gracePeriod=30
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.555029 4755 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj"
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.679908 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") "
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.680000 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") "
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.680148 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") "
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.680246 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") "
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.687455 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p" (OuterVolumeSpecName: "kube-api-access-ccz2p") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "kube-api-access-ccz2p".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.697844 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts" (OuterVolumeSpecName: "scripts") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.714768 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.718841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data" (OuterVolumeSpecName: "config-data") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782812 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782889 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782903 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782918 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.033624 4755 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088242 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") "
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088351 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") "
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088406 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") "
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088488 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") "
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088603 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") "
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.089403 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs" (OuterVolumeSpecName: "logs") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.089913 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.095598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv" (OuterVolumeSpecName: "kube-api-access-2f9jv") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "kube-api-access-2f9jv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.126050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data" (OuterVolumeSpecName: "config-data") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.133093 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.148568 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194774 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194876 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194893 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194936 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.197380 4755 generic.go:334] "Generic (PLEG): container finished" podID="9039b999-a68c-4920-af85-ac61d8509b06" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831" exitCode=143
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.197398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerDied","Data":"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"}
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201889 4755 generic.go:334] "Generic (PLEG): container finished" podID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb" exitCode=0
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201911 4755 generic.go:334] "Generic (PLEG): container finished" podID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158" exitCode=143
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201949 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerDied","Data":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"}
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerDied","Data":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"}
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerDied","Data":"7a21afc95b0624df6ce1d9bc13f6bf0f3fd81506690ec7d242284e3e4ee61373"}
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.202008 4755 scope.go:117] "RemoveContainer" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320
13:52:16.204304 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerDied","Data":"01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636"}
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.204341 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.204348 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.204420 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" containerID="cri-o://e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" gracePeriod=30
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.237057 4755 scope.go:117] "RemoveContainer" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245160 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245720 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245744 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245760 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata"
Mar 20
13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245768 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245786 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245793 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245817 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245836 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245852 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="init"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245860 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="init"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245870 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerName="nova-manage"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245878 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerName="nova-manage"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246076 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerName="nova-manage"
Mar 20 13:52:16 crc
kubenswrapper[4755]: I0320 13:52:16.246098 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246109 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246135 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246155 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246896 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.250225 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.273091 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.283622 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.287627 4755 scope.go:117] "RemoveContainer" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.288152 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": container with ID starting with
b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb not found: ID does not exist" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.288191 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"} err="failed to get container status \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": rpc error: code = NotFound desc = could not find container \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": container with ID starting with b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb not found: ID does not exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.288219 4755 scope.go:117] "RemoveContainer" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.293764 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": container with ID starting with 786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158 not found: ID does not exist" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.293813 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"} err="failed to get container status \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": rpc error: code = NotFound desc = could not find container \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": container with ID starting with 786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158 not found: ID does not
exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.293847 4755 scope.go:117] "RemoveContainer" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.293977 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.294391 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"} err="failed to get container status \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": rpc error: code = NotFound desc = could not find container \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": container with ID starting with b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb not found: ID does not exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.294434 4755 scope.go:117] "RemoveContainer" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.294917 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"} err="failed to get container status \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": rpc error: code = NotFound desc = could not find container \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": container with ID starting with 786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158 not found: ID does not exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.296070 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-config-data\") pod \"nova-cell1-conductor-0\"
(UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.296119 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnjr5\" (UniqueName: \"kubernetes.io/projected/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-kube-api-access-hnjr5\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.296195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.321793 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.323622 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.326081 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.326406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.336864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398533 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398624 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398708 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398913 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398951 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnjr5\" (UniqueName: \"kubernetes.io/projected/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-kube-api-access-hnjr5\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.402785 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " 
pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.402862 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.415511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnjr5\" (UniqueName: \"kubernetes.io/projected/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-kube-api-access-hnjr5\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500683 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.501278 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.505127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.505272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.506173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " 
pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.525467 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.578616 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.641240 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.061102 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:52:17 crc kubenswrapper[4755]: W0320 13:52:17.067866 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32aa4c4f_3c67_46f5_90ae_59d17077eb1d.slice/crio-08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f WatchSource:0}: Error finding container 08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f: Status 404 returned error can't find the container with id 08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f Mar 20 13:52:17 crc kubenswrapper[4755]: W0320 13:52:17.179573 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc55c1b8_6ed7_41ba_b5a6_8fe3f03fe3c7.slice/crio-b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab WatchSource:0}: Error finding container b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab: Status 404 returned error can't find the container with id b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab Mar 20 
13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.181189 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.216768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerStarted","Data":"b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab"} Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.218704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"32aa4c4f-3c67-46f5-90ae-59d17077eb1d","Type":"ContainerStarted","Data":"08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f"} Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.246437 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" path="/var/lib/kubelet/pods/56052c23-c9d5-4eba-9696-13d244f6cf97/volumes" Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.248086 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" path="/var/lib/kubelet/pods/f4e36ff1-5396-4e15-ad2f-6312bc653076/volumes" Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.210137 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.212167 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.213567 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.213617 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.232427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"32aa4c4f-3c67-46f5-90ae-59d17077eb1d","Type":"ContainerStarted","Data":"561e9d4a927852e54a364b78bdf7d50a740fd277b73deb5ad2d594176e1f9238"} Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.232637 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.239506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerStarted","Data":"e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b"} Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.240063 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerStarted","Data":"b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7"} Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.262829 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.262805423 podStartE2EDuration="2.262805423s" podCreationTimestamp="2026-03-20 13:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:18.258748828 +0000 UTC m=+1317.856681417" watchObservedRunningTime="2026-03-20 13:52:18.262805423 +0000 UTC m=+1317.860737962" Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.290888 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.29086228 podStartE2EDuration="2.29086228s" podCreationTimestamp="2026-03-20 13:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:18.288838207 +0000 UTC m=+1317.886770756" watchObservedRunningTime="2026-03-20 13:52:18.29086228 +0000 UTC m=+1317.888794849" Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.859820 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.973890 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"5c488527-be33-4a36-a073-1a49802e28dd\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.973962 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"5c488527-be33-4a36-a073-1a49802e28dd\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.974158 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"5c488527-be33-4a36-a073-1a49802e28dd\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.980257 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n" (OuterVolumeSpecName: "kube-api-access-lpf6n") pod "5c488527-be33-4a36-a073-1a49802e28dd" (UID: "5c488527-be33-4a36-a073-1a49802e28dd"). InnerVolumeSpecName "kube-api-access-lpf6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.001882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data" (OuterVolumeSpecName: "config-data") pod "5c488527-be33-4a36-a073-1a49802e28dd" (UID: "5c488527-be33-4a36-a073-1a49802e28dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.038509 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c488527-be33-4a36-a073-1a49802e28dd" (UID: "5c488527-be33-4a36-a073-1a49802e28dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.075911 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.075950 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.075960 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.252078 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.260907 4755 generic.go:334] "Generic (PLEG): container finished" podID="9039b999-a68c-4920-af85-ac61d8509b06" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5" exitCode=0 Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261059 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerDied","Data":"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"} Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261375 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerDied","Data":"d305ab31e9622ed372defac60b08de6826298aff9ba6d4afc85a9d25d074e86a"} Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261398 4755 scope.go:117] "RemoveContainer" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.264372 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c488527-be33-4a36-a073-1a49802e28dd" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" exitCode=0 Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.265104 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.269584 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerDied","Data":"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57"} Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.269725 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerDied","Data":"07fce25f45c6fe707885052798e47cbce52b19aa7717044f52d0bb81703a15ba"} Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279507 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279614 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279798 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279853 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod 
\"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.280750 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs" (OuterVolumeSpecName: "logs") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.287057 4755 scope.go:117] "RemoveContainer" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.288151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85" (OuterVolumeSpecName: "kube-api-access-jwb85") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "kube-api-access-jwb85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.319781 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.322359 4755 scope.go:117] "RemoveContainer" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.322365 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.323525 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5\": container with ID starting with a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5 not found: ID does not exist" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.323587 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"} err="failed to get container status \"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5\": rpc error: code = NotFound desc = could not find container \"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5\": container with ID starting with a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5 not found: ID does not exist" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.323624 4755 scope.go:117] "RemoveContainer" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.324669 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831\": container with ID starting with 51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831 not found: ID does not exist" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.324736 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"} err="failed 
to get container status \"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831\": rpc error: code = NotFound desc = could not find container \"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831\": container with ID starting with 51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831 not found: ID does not exist" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.324758 4755 scope.go:117] "RemoveContainer" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.331680 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.339800 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data" (OuterVolumeSpecName: "config-data") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.356871 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.357416 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357433 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.357455 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357464 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.357487 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-api" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357495 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-api" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357716 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357739 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357757 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9039b999-a68c-4920-af85-ac61d8509b06" 
containerName="nova-api-api" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.358528 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.360419 4755 scope.go:117] "RemoveContainer" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.361193 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.362122 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57\": container with ID starting with e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57 not found: ID does not exist" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.362230 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57"} err="failed to get container status \"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57\": rpc error: code = NotFound desc = could not find container \"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57\": container with ID starting with e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57 not found: ID does not exist" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " 
pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383491 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383505 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383521 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383532 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.486613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.487282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.487409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.489849 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.491021 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.503695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5r92\" 
(UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.640839 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.658982 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.676547 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.678264 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.678373 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.681013 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.685037 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.794810 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.794884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.795007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.795080 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896580 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896742 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896786 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.898133 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.904197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.911180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:21 crc kubenswrapper[4755]: 
I0320 13:52:20.921943 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.004036 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.182553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:21 crc kubenswrapper[4755]: W0320 13:52:21.184837 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b22e0c3_341e_444d_a615_50d5ccdc9f12.slice/crio-0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82 WatchSource:0}: Error finding container 0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82: Status 404 returned error can't find the container with id 0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82 Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.240319 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c488527-be33-4a36-a073-1a49802e28dd" path="/var/lib/kubelet/pods/5c488527-be33-4a36-a073-1a49802e28dd/volumes" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.242019 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9039b999-a68c-4920-af85-ac61d8509b06" path="/var/lib/kubelet/pods/9039b999-a68c-4920-af85-ac61d8509b06/volumes" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.291029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerStarted","Data":"0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82"} Mar 20 13:52:22 crc 
kubenswrapper[4755]: I0320 13:52:22.088546 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:22 crc kubenswrapper[4755]: I0320 13:52:22.308586 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerStarted","Data":"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8"} Mar 20 13:52:22 crc kubenswrapper[4755]: I0320 13:52:22.308637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerStarted","Data":"69613a5a056ceb190560dd090764a72a502dd1b9d118cfd002538e2432b55f6e"} Mar 20 13:52:22 crc kubenswrapper[4755]: I0320 13:52:22.310147 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerStarted","Data":"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda"} Mar 20 13:52:23 crc kubenswrapper[4755]: I0320 13:52:23.325181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerStarted","Data":"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46"} Mar 20 13:52:23 crc kubenswrapper[4755]: I0320 13:52:23.357827 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.357795775 podStartE2EDuration="3.357795775s" podCreationTimestamp="2026-03-20 13:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:22.338126167 +0000 UTC m=+1321.936058696" watchObservedRunningTime="2026-03-20 13:52:23.357795775 +0000 UTC m=+1322.955728344" Mar 20 13:52:23 crc kubenswrapper[4755]: I0320 13:52:23.360910 4755 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.360896155 podStartE2EDuration="3.360896155s" podCreationTimestamp="2026-03-20 13:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:23.346103862 +0000 UTC m=+1322.944036411" watchObservedRunningTime="2026-03-20 13:52:23.360896155 +0000 UTC m=+1322.958828724" Mar 20 13:52:24 crc kubenswrapper[4755]: I0320 13:52:24.125849 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:52:25 crc kubenswrapper[4755]: I0320 13:52:25.686352 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:52:26 crc kubenswrapper[4755]: I0320 13:52:26.610153 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:26 crc kubenswrapper[4755]: I0320 13:52:26.641680 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:52:26 crc kubenswrapper[4755]: I0320 13:52:26.641733 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:52:27 crc kubenswrapper[4755]: I0320 13:52:27.162188 4755 scope.go:117] "RemoveContainer" containerID="cdf7ecd10574feecd4a5c8a0aa5c3c10ad9149f00e749b3e48d52a3b5587a97f" Mar 20 13:52:27 crc kubenswrapper[4755]: I0320 13:52:27.657820 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:27 crc kubenswrapper[4755]: I0320 13:52:27.657835 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.233499 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.234241 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" containerID="cri-o://8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28" gracePeriod=30 Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.381478 4755 generic.go:334] "Generic (PLEG): container finished" podID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerID="8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28" exitCode=2 Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.381519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerDied","Data":"8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28"} Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.782358 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.888752 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.897213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw" (OuterVolumeSpecName: "kube-api-access-4tpmw") pod "5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" (UID: "5ac4bdab-eaee-4ee6-a3e1-2f754c179d60"). InnerVolumeSpecName "kube-api-access-4tpmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.990762 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.392113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerDied","Data":"1f936cfbd135019d1572ee465a4fb61fade57721a1a7701a47ec15a9bf86c1cd"} Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.392191 4755 scope.go:117] "RemoveContainer" containerID="8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.392199 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.422050 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.434278 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.438751 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: E0320 13:52:29.439228 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.439252 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.439480 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.440140 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.449534 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.457395 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.461400 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499300 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499453 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499484 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfvrd\" (UniqueName: 
\"kubernetes.io/projected/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-api-access-cfvrd\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvrd\" (UniqueName: \"kubernetes.io/projected/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-api-access-cfvrd\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.615978 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.616013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.616526 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.619089 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvrd\" (UniqueName: \"kubernetes.io/projected/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-api-access-cfvrd\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.757922 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.209359 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210312 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" containerID="cri-o://d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210396 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" containerID="cri-o://d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210551 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" containerID="cri-o://a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210410 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" containerID="cri-o://eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.264287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.401543 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"4f27a8a2-0755-47ae-a7b4-63787c8c9393","Type":"ContainerStarted","Data":"b92695a785998a573c5e5b231bdac4b7f5867d17cc52d242074f5f1bbe5bbca6"} Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.409606 4755 generic.go:334] "Generic (PLEG): container finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" exitCode=2 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.409686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44"} Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.685938 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.718295 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.005550 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.005599 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.237904 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" path="/var/lib/kubelet/pods/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60/volumes" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.421580 4755 generic.go:334] "Generic (PLEG): container finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" exitCode=0 Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.423089 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" exitCode=0 Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.421624 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398"} Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.423309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5"} Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.425276 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4f27a8a2-0755-47ae-a7b4-63787c8c9393","Type":"ContainerStarted","Data":"37e1a33a39052d95fffee5fa310d247339624777ba841f4ced83f91cbf750277"} Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.425611 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.454570 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.073128355 podStartE2EDuration="2.454548516s" podCreationTimestamp="2026-03-20 13:52:29 +0000 UTC" firstStartedPulling="2026-03-20 13:52:30.278029115 +0000 UTC m=+1329.875961644" lastFinishedPulling="2026-03-20 13:52:30.659449286 +0000 UTC m=+1330.257381805" observedRunningTime="2026-03-20 13:52:31.442449413 +0000 UTC m=+1331.040381942" watchObservedRunningTime="2026-03-20 13:52:31.454548516 +0000 UTC m=+1331.052481055" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.491568 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Mar 20 13:52:32 crc kubenswrapper[4755]: I0320 13:52:32.087879 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:32 crc kubenswrapper[4755]: I0320 13:52:32.087919 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:34 crc kubenswrapper[4755]: I0320 13:52:34.642412 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:52:34 crc kubenswrapper[4755]: I0320 13:52:34.642941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.435952 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492630 4755 generic.go:334] "Generic (PLEG): container finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" exitCode=0 Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0"} Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29"} Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492763 4755 scope.go:117] "RemoveContainer" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492963 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.519941 4755 scope.go:117] "RemoveContainer" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525064 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525157 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525215 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525247 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: 
\"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525495 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525878 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.526854 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.540497 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs" (OuterVolumeSpecName: "kube-api-access-5n5fs") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "kube-api-access-5n5fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.542267 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts" (OuterVolumeSpecName: "scripts") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.557258 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.560018 4755 scope.go:117] "RemoveContainer" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.607640 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630827 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630856 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630866 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630875 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.635403 4755 scope.go:117] "RemoveContainer" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.641348 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data" (OuterVolumeSpecName: "config-data") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662212 4755 scope.go:117] "RemoveContainer" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.662508 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398\": container with ID starting with eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398 not found: ID does not exist" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662533 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398"} err="failed to get container status \"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398\": rpc error: code = NotFound desc = could not find container \"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398\": container with ID starting with eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662556 4755 scope.go:117] "RemoveContainer" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.662919 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44\": container with ID starting with d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44 not found: ID does not exist" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662978 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44"} err="failed to get container status \"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44\": rpc error: code = NotFound desc = could not find container \"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44\": container with ID starting with d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663034 4755 scope.go:117] "RemoveContainer" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.663549 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0\": container with ID starting with a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0 not found: ID does not exist" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663572 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0"} err="failed to get container status \"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0\": rpc error: code = NotFound desc = could not find container \"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0\": container with ID starting with a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663587 4755 scope.go:117] "RemoveContainer" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 
13:52:35.663824 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5\": container with ID starting with d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5 not found: ID does not exist" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663859 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5"} err="failed to get container status \"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5\": rpc error: code = NotFound desc = could not find container \"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5\": container with ID starting with d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.732305 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.830520 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.838269 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.866617 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867069 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867093 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867109 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867115 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867127 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867134 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867163 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867169 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867487 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867514 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867526 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 
13:52:35.867537 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.871231 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.873295 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.873510 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.875914 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.910539 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935505 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdc2\" (UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935642 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " 
pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037821 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037907 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038020 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038057 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdc2\" (UniqueName: 
\"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038525 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038609 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.041984 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.042163 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.043438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.044197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.046546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.057206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdc2\" (UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.193866 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.647312 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.692368 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.693342 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.751498 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.751567 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.774541 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:37 crc kubenswrapper[4755]: I0320 13:52:37.236582 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" path="/var/lib/kubelet/pods/886eb096-8aa3-423b-b611-03cc592de1d0/volumes" Mar 20 13:52:37 crc kubenswrapper[4755]: I0320 13:52:37.517199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"94bfcab311a4ee3e71d6c186d7867de80fdb4abfe47c1e419a433ca5d4a60238"} Mar 20 13:52:37 crc kubenswrapper[4755]: I0320 13:52:37.524622 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.528250 4755 generic.go:334] "Generic (PLEG): container finished" podID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerID="56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0" exitCode=137 Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.528316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerDied","Data":"56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.529325 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerDied","Data":"ec205dafe86043e70d781baf18f8b1170a7b1ab37a63103271c795f73d1873fe"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.529342 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec205dafe86043e70d781baf18f8b1170a7b1ab37a63103271c795f73d1873fe" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.532057 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.532115 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.532139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.593802 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"b54ba84f-e5e3-48ba-b283-2c37348fef90\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.594041 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"b54ba84f-e5e3-48ba-b283-2c37348fef90\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.594093 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"b54ba84f-e5e3-48ba-b283-2c37348fef90\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.598145 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn" (OuterVolumeSpecName: 
"kube-api-access-5pgcn") pod "b54ba84f-e5e3-48ba-b283-2c37348fef90" (UID: "b54ba84f-e5e3-48ba-b283-2c37348fef90"). InnerVolumeSpecName "kube-api-access-5pgcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.621890 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b54ba84f-e5e3-48ba-b283-2c37348fef90" (UID: "b54ba84f-e5e3-48ba-b283-2c37348fef90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.639929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data" (OuterVolumeSpecName: "config-data") pod "b54ba84f-e5e3-48ba-b283-2c37348fef90" (UID: "b54ba84f-e5e3-48ba-b283-2c37348fef90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.697515 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.697544 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.697555 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.004859 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.004915 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.547186 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.547194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.573199 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.589272 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.608009 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: E0320 13:52:39.608409 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.608426 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.608674 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.609334 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.611446 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.611535 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.613771 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.632426 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.717212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.717525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.717674 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc 
kubenswrapper[4755]: I0320 13:52:39.717842 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7sp\" (UniqueName: \"kubernetes.io/projected/8318edf5-5648-4c19-8853-3d555435ed6f-kube-api-access-bk7sp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.718122 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.768709 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820215 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820293 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-vencrypt-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7sp\" (UniqueName: \"kubernetes.io/projected/8318edf5-5648-4c19-8853-3d555435ed6f-kube-api-access-bk7sp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820434 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.829799 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.841401 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.845328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.851175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.863215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7sp\" (UniqueName: \"kubernetes.io/projected/8318edf5-5648-4c19-8853-3d555435ed6f-kube-api-access-bk7sp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.930192 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:40 crc kubenswrapper[4755]: I0320 13:52:40.382899 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:40 crc kubenswrapper[4755]: I0320 13:52:40.556781 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8318edf5-5648-4c19-8853-3d555435ed6f","Type":"ContainerStarted","Data":"7f512918bbc22e31ca0c18f92be8801e701687c7309119e0bebcd5b0ee178fe2"} Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.026810 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.031563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.035206 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 
13:52:41.240121 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" path="/var/lib/kubelet/pods/b54ba84f-e5e3-48ba-b283-2c37348fef90/volumes" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.570462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8318edf5-5648-4c19-8853-3d555435ed6f","Type":"ContainerStarted","Data":"8e5c202f3642f29bd6ba37127fc1deb176e98ed39a109baf7f1c56e5ccff9652"} Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.577216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.598371 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.598346093 podStartE2EDuration="2.598346093s" podCreationTimestamp="2026-03-20 13:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:41.587054561 +0000 UTC m=+1341.184987100" watchObservedRunningTime="2026-03-20 13:52:41.598346093 +0000 UTC m=+1341.196278632" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.905807 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mkxft"] Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.907306 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.916066 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mkxft"] Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.053714 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh629\" (UniqueName: \"kubernetes.io/projected/204ff403-3d73-430e-aa64-a41f033f641e-kube-api-access-xh629\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.053803 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-config\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.053948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.054225 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.054322 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.054390 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155811 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155870 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155964 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh629\" (UniqueName: \"kubernetes.io/projected/204ff403-3d73-430e-aa64-a41f033f641e-kube-api-access-xh629\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-config\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.156057 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.156160 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.158878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.163742 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-config\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.163879 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.164229 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.169540 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.173014 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh629\" (UniqueName: \"kubernetes.io/projected/204ff403-3d73-430e-aa64-a41f033f641e-kube-api-access-xh629\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.378512 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.597807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.598071 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.625164 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.616080841 podStartE2EDuration="7.625146586s" podCreationTimestamp="2026-03-20 13:52:35 +0000 UTC" firstStartedPulling="2026-03-20 13:52:36.791337623 +0000 UTC m=+1336.389270152" lastFinishedPulling="2026-03-20 13:52:41.800403368 +0000 UTC m=+1341.398335897" observedRunningTime="2026-03-20 13:52:42.618304088 +0000 UTC m=+1342.216236637" watchObservedRunningTime="2026-03-20 13:52:42.625146586 +0000 UTC m=+1342.223079115" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.945886 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mkxft"] Mar 20 13:52:43 crc kubenswrapper[4755]: I0320 13:52:43.613365 4755 generic.go:334] "Generic (PLEG): container finished" podID="204ff403-3d73-430e-aa64-a41f033f641e" containerID="12ccd499604429e6658a715e4e378949d8500574fbe5fddc12dbd0637665657f" exitCode=0 Mar 20 13:52:43 crc kubenswrapper[4755]: I0320 13:52:43.613454 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" event={"ID":"204ff403-3d73-430e-aa64-a41f033f641e","Type":"ContainerDied","Data":"12ccd499604429e6658a715e4e378949d8500574fbe5fddc12dbd0637665657f"} Mar 20 13:52:43 crc kubenswrapper[4755]: I0320 13:52:43.613807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" event={"ID":"204ff403-3d73-430e-aa64-a41f033f641e","Type":"ContainerStarted","Data":"a95fceb205e141c263b018743de45ef8e3832fb1265ea1d6ba1bebf93404366a"} Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.287003 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.551839 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627314 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" event={"ID":"204ff403-3d73-430e-aa64-a41f033f641e","Type":"ContainerStarted","Data":"bf24d2988a5cea00c8fe4990e7a58673b96811d94ae34d804e06829ff9fc740c"} Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627450 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" containerID="cri-o://d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" gracePeriod=30 Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627530 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" containerID="cri-o://8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" gracePeriod=30 Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627530 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.660013 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" podStartSLOduration=3.659995265 podStartE2EDuration="3.659995265s" podCreationTimestamp="2026-03-20 13:52:41 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:44.654284167 +0000 UTC m=+1344.252216696" watchObservedRunningTime="2026-03-20 13:52:44.659995265 +0000 UTC m=+1344.257927784" Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.932047 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.636649 4755 generic.go:334] "Generic (PLEG): container finished" podID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" exitCode=143 Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.636761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerDied","Data":"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8"} Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637351 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent" containerID="cri-o://af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" gracePeriod=30 Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637385 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core" containerID="cri-o://3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" gracePeriod=30 Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637436 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent" 
containerID="cri-o://d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" gracePeriod=30 Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637512 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="proxy-httpd" containerID="cri-o://6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" gracePeriod=30 Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.512331 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.648957 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" exitCode=0 Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.648992 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" exitCode=2 Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649003 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" exitCode=0 Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649011 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" exitCode=0 Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 
13:52:46.649033 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"94bfcab311a4ee3e71d6c186d7867de80fdb4abfe47c1e419a433ca5d4a60238"} Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649117 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.651905 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.651939 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdc2\" 
(UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652156 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652299 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652373 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652467 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652498 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 
13:52:46.652528 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653127 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653161 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653893 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653913 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.658028 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2" (OuterVolumeSpecName: "kube-api-access-mvdc2") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "kube-api-access-mvdc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.658623 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts" (OuterVolumeSpecName: "scripts") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.677991 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.693871 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.707000 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.723166 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.741792 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.754588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755546 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755581 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdc2\" (UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755594 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755604 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755614 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.773302 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.773794 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist" 
containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.773835 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.773861 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.774214 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774239 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774360 4755 scope.go:117] 
"RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.774621 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774641 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774682 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.774890 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774909 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774922 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775257 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775270 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775445 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not 
exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775458 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775804 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775822 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776010 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data" (OuterVolumeSpecName: "config-data") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776080 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776101 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776348 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776369 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776701 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container 
\"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776721 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776938 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776958 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777374 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777429 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777733 4755 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777755 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.778007 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.778034 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.779850 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with 
d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.779890 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.780183 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.857543 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.979773 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.986681 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.012762 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 13:52:47.013246 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013267 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core" Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 
13:52:47.013276 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="proxy-httpd" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013284 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="proxy-httpd" Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 13:52:47.013300 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013308 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent" Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 13:52:47.013332 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013339 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013504 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013515 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013524 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013545 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" 
containerName="proxy-httpd" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.016001 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.019119 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.019305 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.019530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.027328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.162308 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdk2z\" (UniqueName: \"kubernetes.io/projected/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-kube-api-access-rdk2z\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.162838 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-run-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.162979 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 
crc kubenswrapper[4755]: I0320 13:52:47.163260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-config-data\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163396 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-log-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163495 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-scripts\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.238963 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" path="/var/lib/kubelet/pods/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4/volumes" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.265826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdk2z\" (UniqueName: \"kubernetes.io/projected/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-kube-api-access-rdk2z\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.265920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-run-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.265946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266014 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-config-data\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-log-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266089 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-scripts\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.267823 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-log-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.268150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-run-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.271562 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " 
pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.271698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-config-data\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.271851 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.272579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-scripts\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.274341 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.284679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdk2z\" (UniqueName: \"kubernetes.io/projected/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-kube-api-access-rdk2z\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.332495 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.838797 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:47 crc kubenswrapper[4755]: W0320 13:52:47.883046 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c583579_b927_4ef7_bfc9_0c54a2e77bcb.slice/crio-72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d WatchSource:0}: Error finding container 72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d: Status 404 returned error can't find the container with id 72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.245087 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.391382 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.391523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.391641 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc 
kubenswrapper[4755]: I0320 13:52:48.391726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.392588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs" (OuterVolumeSpecName: "logs") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.395708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7" (OuterVolumeSpecName: "kube-api-access-fl9w7") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "kube-api-access-fl9w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.420684 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.433626 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data" (OuterVolumeSpecName: "config-data") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493448 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493484 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493494 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493505 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670240 4755 generic.go:334] "Generic (PLEG): container finished" podID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" exitCode=0 Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670351 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerDied","Data":"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerDied","Data":"69613a5a056ceb190560dd090764a72a502dd1b9d118cfd002538e2432b55f6e"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670410 4755 scope.go:117] "RemoveContainer" containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670415 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.673521 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"db7ea86e072e50ae870fae22c864dcd5d14312514d13b8d5b9f71b6ee5eb37c1"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.673566 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.721142 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.724547 4755 scope.go:117] "RemoveContainer" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.730529 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755215 4755 scope.go:117] "RemoveContainer" 
containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.755599 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46\": container with ID starting with 8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46 not found: ID does not exist" containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755632 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46"} err="failed to get container status \"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46\": rpc error: code = NotFound desc = could not find container \"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46\": container with ID starting with 8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46 not found: ID does not exist" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755668 4755 scope.go:117] "RemoveContainer" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.755930 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8\": container with ID starting with d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8 not found: ID does not exist" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755980 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8"} err="failed to get container status \"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8\": rpc error: code = NotFound desc = could not find container \"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8\": container with ID starting with d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8 not found: ID does not exist" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.757298 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.757772 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.757793 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.757828 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.757836 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.758035 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.758056 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.759272 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.762453 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.762676 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.764479 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.768411 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921184 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z59v\" 
(UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921489 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921535 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023524 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 
20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023563 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023687 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.027477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.033891 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.034106 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.034224 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.035625 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.051298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.128365 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.245498 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" path="/var/lib/kubelet/pods/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0/volumes" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.599120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:49 crc kubenswrapper[4755]: W0320 13:52:49.611339 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb3effa2_e877_484a_8003_06a326a0b48b.slice/crio-8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f WatchSource:0}: Error finding container 8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f: Status 404 returned error can't find the container with id 8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.689773 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"2f273839bec72005f082a6cf6998a1a83b499301b33c48c31fd5b1ece7372f03"} Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.693670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerStarted","Data":"8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f"} Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.939216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.983677 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 
13:52:50.706725 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"92d5de14150800072f17f3008f4d24092d55c9c324814e9b3e5ef7f104145444"} Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.721280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerStarted","Data":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"} Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.721508 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerStarted","Data":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"} Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.743758 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.744448 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.744434192 podStartE2EDuration="2.744434192s" podCreationTimestamp="2026-03-20 13:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:50.740762517 +0000 UTC m=+1350.338695046" watchObservedRunningTime="2026-03-20 13:52:50.744434192 +0000 UTC m=+1350.342366741" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.951209 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.952739 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.955358 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.955550 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.970128 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083684 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083743 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083861 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185024 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185260 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.195320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.195504 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.195597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.205986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.278370 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.734865 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 13:52:51 crc kubenswrapper[4755]: W0320 13:52:51.739884 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod557c5385_782c_410a_a371_b27f41d88a47.slice/crio-4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd WatchSource:0}: Error finding container 4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd: Status 404 returned error can't find the container with id 4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.381850 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.465204 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.465435 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-726q6" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" containerID="cri-o://62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d" gracePeriod=10 Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.742998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"2350a80dfcdc41ddafcce0834f07da754b792f1d0e614e76d528ca12b6dc8def"} Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.744768 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 
13:52:52.779188 4755 generic.go:334] "Generic (PLEG): container finished" podID="24dab838-4670-45f3-8276-240f4266194d" containerID="62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d" exitCode=0 Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.779264 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerDied","Data":"62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d"} Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.789695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerStarted","Data":"93d3611d6dc9b7879481031c0ca175844e4e535b8456bd316e081a243992d2fa"} Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.789729 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerStarted","Data":"4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd"} Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.792237 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.579152904 podStartE2EDuration="6.792208396s" podCreationTimestamp="2026-03-20 13:52:46 +0000 UTC" firstStartedPulling="2026-03-20 13:52:47.888288864 +0000 UTC m=+1347.486221393" lastFinishedPulling="2026-03-20 13:52:52.101344356 +0000 UTC m=+1351.699276885" observedRunningTime="2026-03-20 13:52:52.769050036 +0000 UTC m=+1352.366982565" watchObservedRunningTime="2026-03-20 13:52:52.792208396 +0000 UTC m=+1352.390140925" Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.813422 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7fs4m" podStartSLOduration=2.813407316 
podStartE2EDuration="2.813407316s" podCreationTimestamp="2026-03-20 13:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:52.807936094 +0000 UTC m=+1352.405868623" watchObservedRunningTime="2026-03-20 13:52:52.813407316 +0000 UTC m=+1352.411339845" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.024233 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.130348 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131375 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131569 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 
13:52:53.131825 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131937 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.179032 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc" (OuterVolumeSpecName: "kube-api-access-75cnc") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "kube-api-access-75cnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.187552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config" (OuterVolumeSpecName: "config") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.204247 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.206940 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.208098 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.214208 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235519 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235549 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235558 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235570 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235579 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235588 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.797372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerDied","Data":"a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9"} Mar 20 13:52:53 crc 
kubenswrapper[4755]: I0320 13:52:53.797438 4755 scope.go:117] "RemoveContainer" containerID="62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.797728 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.857294 4755 scope.go:117] "RemoveContainer" containerID="1acd81a61329d7b9a26f38cb792eb39488a3ccf7a0bcc5d4334568c772df3f16" Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.877545 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.889637 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:55 crc kubenswrapper[4755]: I0320 13:52:55.251452 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24dab838-4670-45f3-8276-240f4266194d" path="/var/lib/kubelet/pods/24dab838-4670-45f3-8276-240f4266194d/volumes" Mar 20 13:52:56 crc kubenswrapper[4755]: I0320 13:52:56.849848 4755 generic.go:334] "Generic (PLEG): container finished" podID="557c5385-782c-410a-a371-b27f41d88a47" containerID="93d3611d6dc9b7879481031c0ca175844e4e535b8456bd316e081a243992d2fa" exitCode=0 Mar 20 13:52:56 crc kubenswrapper[4755]: I0320 13:52:56.849887 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerDied","Data":"93d3611d6dc9b7879481031c0ca175844e4e535b8456bd316e081a243992d2fa"} Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.256583 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389102 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389363 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389921 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389973 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.397106 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4" (OuterVolumeSpecName: "kube-api-access-h27h4") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "kube-api-access-h27h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.397152 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts" (OuterVolumeSpecName: "scripts") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.427028 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.435215 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data" (OuterVolumeSpecName: "config-data") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493220 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493260 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493273 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493288 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.878870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerDied","Data":"4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd"} Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.879251 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd" Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.878970 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.112396 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.112792 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" containerID="cri-o://e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" gracePeriod=30 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.113489 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" containerID="cri-o://95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" gracePeriod=30 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.119832 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.120091 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" containerID="cri-o://1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" gracePeriod=30 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.293463 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.293897 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" containerID="cri-o://b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7" gracePeriod=30 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.294028 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" containerID="cri-o://e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b" gracePeriod=30 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.747565 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832142 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832212 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832345 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832400 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832432 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.834186 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs" (OuterVolumeSpecName: "logs") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.839081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v" (OuterVolumeSpecName: "kube-api-access-8z59v") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "kube-api-access-8z59v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.868547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.885313 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data" (OuterVolumeSpecName: "config-data") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.900950 4755 generic.go:334] "Generic (PLEG): container finished" podID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerID="b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7" exitCode=143 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.901040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerDied","Data":"b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7"} Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904023 4755 generic.go:334] "Generic (PLEG): container finished" podID="fb3effa2-e877-484a-8003-06a326a0b48b" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" exitCode=0 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904077 4755 generic.go:334] "Generic (PLEG): container finished" podID="fb3effa2-e877-484a-8003-06a326a0b48b" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" exitCode=143 Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerDied","Data":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"} Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904165 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerDied","Data":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"} Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904163 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904200 4755 scope.go:117] "RemoveContainer" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904183 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerDied","Data":"8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f"} Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.921332 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.923350 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.936565 4755 scope.go:117] "RemoveContainer" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939631 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939703 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939714 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939725 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939733 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939743 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.967508 4755 scope.go:117] "RemoveContainer" 
containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" Mar 20 13:52:59 crc kubenswrapper[4755]: E0320 13:52:59.968787 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": container with ID starting with 95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e not found: ID does not exist" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.968832 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"} err="failed to get container status \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": rpc error: code = NotFound desc = could not find container \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": container with ID starting with 95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e not found: ID does not exist" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.968864 4755 scope.go:117] "RemoveContainer" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" Mar 20 13:52:59 crc kubenswrapper[4755]: E0320 13:52:59.969636 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": container with ID starting with e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd not found: ID does not exist" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.969715 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"} err="failed to get container status \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": rpc error: code = NotFound desc = could not find container \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": container with ID starting with e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd not found: ID does not exist" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970256 4755 scope.go:117] "RemoveContainer" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970682 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"} err="failed to get container status \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": rpc error: code = NotFound desc = could not find container \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": container with ID starting with 95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e not found: ID does not exist" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970704 4755 scope.go:117] "RemoveContainer" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970934 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"} err="failed to get container status \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": rpc error: code = NotFound desc = could not find container \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": container with ID starting with e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd not found: ID does not 
exist" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.239734 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.247417 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.267273 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268013 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557c5385-782c-410a-a371-b27f41d88a47" containerName="nova-manage" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268118 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="557c5385-782c-410a-a371-b27f41d88a47" containerName="nova-manage" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268215 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="init" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268291 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="init" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268362 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268429 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268506 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268567 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" Mar 20 
13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268676 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268743 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269021 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269131 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="557c5385-782c-410a-a371-b27f41d88a47" containerName="nova-manage" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269213 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269286 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.270578 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.275550 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.275649 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.275866 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.291053 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347158 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347218 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqt5x\" (UniqueName: \"kubernetes.io/projected/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-kube-api-access-bqt5x\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-public-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347292 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-config-data\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-logs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347383 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-logs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449651 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc 
kubenswrapper[4755]: I0320 13:53:00.449704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqt5x\" (UniqueName: \"kubernetes.io/projected/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-kube-api-access-bqt5x\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-public-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-config-data\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.450844 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-logs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.456056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.457968 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-config-data\") pod \"nova-api-0\" (UID: 
\"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.459030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-public-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.459866 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.467527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqt5x\" (UniqueName: \"kubernetes.io/projected/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-kube-api-access-bqt5x\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.647667 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.687179 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.688695 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.690390 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.690449 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.120351 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.234798 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" 
path="/var/lib/kubelet/pods/fb3effa2-e877-484a-8003-06a326a0b48b/volumes" Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.929001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b975ad31-5e47-43b2-a0c6-4d1ee9e50006","Type":"ContainerStarted","Data":"0811e02ac39c52724b470230398e95a710c8071a3acffc5ce3ca53256fcea5c9"} Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.929634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b975ad31-5e47-43b2-a0c6-4d1ee9e50006","Type":"ContainerStarted","Data":"f9bd3e6c72d05a8c7eacb7ea7628f8d0cff791fc85e2282eb788e0d4df903a0e"} Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.929645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b975ad31-5e47-43b2-a0c6-4d1ee9e50006","Type":"ContainerStarted","Data":"2a7e0970fbf965bb9ff6dc1fc817b8cc4ed04376556a205e8e23810c41f0146c"} Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.950179 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9501637139999999 podStartE2EDuration="1.950163714s" podCreationTimestamp="2026-03-20 13:53:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:53:01.948766928 +0000 UTC m=+1361.546699477" watchObservedRunningTime="2026-03-20 13:53:01.950163714 +0000 UTC m=+1361.548096243" Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.937925 4755 generic.go:334] "Generic (PLEG): container finished" podID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerID="e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b" exitCode=0 Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.938012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerDied","Data":"e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b"} Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.938376 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerDied","Data":"b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab"} Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.938393 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab" Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.956213 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.106418 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.106571 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.106891 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.107041 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.107632 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs" (OuterVolumeSpecName: "logs") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.107796 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.108471 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.112124 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc" (OuterVolumeSpecName: "kube-api-access-wnvlc") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "kube-api-access-wnvlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.134793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data" (OuterVolumeSpecName: "config-data") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.142118 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.162083 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210299 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210341 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210361 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210372 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.947459 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.973143 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.982847 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.023819 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: E0320 13:53:04.024295 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024321 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" Mar 20 13:53:04 crc kubenswrapper[4755]: E0320 13:53:04.024350 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024358 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024596 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024621 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.025892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.029281 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.030106 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.056209 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v599m\" (UniqueName: \"kubernetes.io/projected/80d8fd29-d89f-4955-86f3-a8137400c67b-kube-api-access-v599m\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126121 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-config-data\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126330 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126562 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80d8fd29-d89f-4955-86f3-a8137400c67b-logs\") pod \"nova-metadata-0\" (UID: 
\"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228268 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80d8fd29-d89f-4955-86f3-a8137400c67b-logs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v599m\" (UniqueName: \"kubernetes.io/projected/80d8fd29-d89f-4955-86f3-a8137400c67b-kube-api-access-v599m\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-config-data\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228563 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.229320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80d8fd29-d89f-4955-86f3-a8137400c67b-logs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.233283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.242350 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.242486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-config-data\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.258010 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v599m\" (UniqueName: \"kubernetes.io/projected/80d8fd29-d89f-4955-86f3-a8137400c67b-kube-api-access-v599m\") pod \"nova-metadata-0\" 
(UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.379003 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.880345 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.886032 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:04 crc kubenswrapper[4755]: W0320 13:53:04.889706 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d8fd29_d89f_4955_86f3_a8137400c67b.slice/crio-3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07 WatchSource:0}: Error finding container 3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07: Status 404 returned error can't find the container with id 3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07 Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987347 4755 generic.go:334] "Generic (PLEG): container finished" podID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" exitCode=0 Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerDied","Data":"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda"} Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987432 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerDied","Data":"0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82"} Mar 20 13:53:04 crc 
kubenswrapper[4755]: I0320 13:53:04.987448 4755 scope.go:117] "RemoveContainer" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987569 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.991816 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80d8fd29-d89f-4955-86f3-a8137400c67b","Type":"ContainerStarted","Data":"3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07"} Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.016433 4755 scope.go:117] "RemoveContainer" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" Mar 20 13:53:05 crc kubenswrapper[4755]: E0320 13:53:05.020136 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda\": container with ID starting with 1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda not found: ID does not exist" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.020195 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda"} err="failed to get container status \"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda\": rpc error: code = NotFound desc = could not find container \"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda\": container with ID starting with 1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda not found: ID does not exist" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.050393 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.051437 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.051643 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.058368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92" (OuterVolumeSpecName: "kube-api-access-f5r92") pod "9b22e0c3-341e-444d-a615-50d5ccdc9f12" (UID: "9b22e0c3-341e-444d-a615-50d5ccdc9f12"). InnerVolumeSpecName "kube-api-access-f5r92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.124886 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b22e0c3-341e-444d-a615-50d5ccdc9f12" (UID: "9b22e0c3-341e-444d-a615-50d5ccdc9f12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.125039 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data" (OuterVolumeSpecName: "config-data") pod "9b22e0c3-341e-444d-a615-50d5ccdc9f12" (UID: "9b22e0c3-341e-444d-a615-50d5ccdc9f12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.156495 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.156524 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.156537 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.236348 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" path="/var/lib/kubelet/pods/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7/volumes" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.428773 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.454858 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.474203 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] 
Mar 20 13:53:05 crc kubenswrapper[4755]: E0320 13:53:05.474840 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.474864 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.475142 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.475925 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.479713 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.487110 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.566569 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88bs\" (UniqueName: \"kubernetes.io/projected/15175c96-bbe4-4a56-be68-a5db33909e54-kube-api-access-b88bs\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.567111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.567151 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-config-data\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.668336 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.668390 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-config-data\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.668526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88bs\" (UniqueName: \"kubernetes.io/projected/15175c96-bbe4-4a56-be68-a5db33909e54-kube-api-access-b88bs\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.680641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.683940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-config-data\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.686219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88bs\" (UniqueName: \"kubernetes.io/projected/15175c96-bbe4-4a56-be68-a5db33909e54-kube-api-access-b88bs\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.804003 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.008719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80d8fd29-d89f-4955-86f3-a8137400c67b","Type":"ContainerStarted","Data":"946bc1cbbf0bb5b4a2df740e13ae2bd9f5f09a66f23558abd1908cbc75084b24"} Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.008768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80d8fd29-d89f-4955-86f3-a8137400c67b","Type":"ContainerStarted","Data":"bc30af876b4ad2f08935a647d697807fdf037fa904c623faed14f2c09838428c"} Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.030067 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.030045725 podStartE2EDuration="3.030045725s" podCreationTimestamp="2026-03-20 13:53:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:53:06.028595498 +0000 UTC m=+1365.626528027" watchObservedRunningTime="2026-03-20 13:53:06.030045725 +0000 UTC m=+1365.627978254" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.242004 4755 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:06 crc kubenswrapper[4755]: W0320 13:53:06.250048 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15175c96_bbe4_4a56_be68_a5db33909e54.slice/crio-c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6 WatchSource:0}: Error finding container c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6: Status 404 returned error can't find the container with id c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6 Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.751737 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.752108 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.752150 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.752934 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:53:06 crc 
kubenswrapper[4755]: I0320 13:53:06.752992 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74" gracePeriod=600 Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.026882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15175c96-bbe4-4a56-be68-a5db33909e54","Type":"ContainerStarted","Data":"5a5a909caf71fd20e500db1f49d69a478f176d61164c67eb5c07dc619b5f6be3"} Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.026947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15175c96-bbe4-4a56-be68-a5db33909e54","Type":"ContainerStarted","Data":"c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6"} Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.030339 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74" exitCode=0 Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.030388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74"} Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.030445 4755 scope.go:117] "RemoveContainer" containerID="4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773" Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.056167 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.056150861 podStartE2EDuration="2.056150861s" 
podCreationTimestamp="2026-03-20 13:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:53:07.049097377 +0000 UTC m=+1366.647029906" watchObservedRunningTime="2026-03-20 13:53:07.056150861 +0000 UTC m=+1366.654083390" Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.238703 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" path="/var/lib/kubelet/pods/9b22e0c3-341e-444d-a615-50d5ccdc9f12/volumes" Mar 20 13:53:08 crc kubenswrapper[4755]: I0320 13:53:08.044533 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a"} Mar 20 13:53:10 crc kubenswrapper[4755]: I0320 13:53:10.648323 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:53:10 crc kubenswrapper[4755]: I0320 13:53:10.649136 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:53:10 crc kubenswrapper[4755]: I0320 13:53:10.804323 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:53:11 crc kubenswrapper[4755]: I0320 13:53:11.661891 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b975ad31-5e47-43b2-a0c6-4d1ee9e50006" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:11 crc kubenswrapper[4755]: I0320 13:53:11.662167 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b975ad31-5e47-43b2-a0c6-4d1ee9e50006" containerName="nova-api-api" 
probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:14 crc kubenswrapper[4755]: I0320 13:53:14.377639 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:53:14 crc kubenswrapper[4755]: I0320 13:53:14.379495 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.423820 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80d8fd29-d89f-4955-86f3-a8137400c67b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.423909 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80d8fd29-d89f-4955-86f3-a8137400c67b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.804726 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.840222 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:53:16 crc kubenswrapper[4755]: I0320 13:53:16.180334 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:53:17 crc kubenswrapper[4755]: I0320 13:53:17.342020 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:53:18 crc kubenswrapper[4755]: I0320 13:53:18.647903 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:53:18 crc kubenswrapper[4755]: I0320 13:53:18.648286 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:53:20 crc kubenswrapper[4755]: I0320 13:53:20.659015 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:53:20 crc kubenswrapper[4755]: I0320 13:53:20.661142 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:53:20 crc kubenswrapper[4755]: I0320 13:53:20.671269 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:53:21 crc kubenswrapper[4755]: I0320 13:53:21.208544 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:53:22 crc kubenswrapper[4755]: I0320 13:53:22.380173 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:53:22 crc kubenswrapper[4755]: I0320 13:53:22.381374 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:53:24 crc kubenswrapper[4755]: I0320 13:53:24.387793 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:53:24 crc kubenswrapper[4755]: I0320 13:53:24.389603 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:53:24 crc kubenswrapper[4755]: I0320 13:53:24.397066 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:53:25 crc kubenswrapper[4755]: I0320 13:53:25.269346 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:53:35 crc kubenswrapper[4755]: I0320 13:53:35.368453 4755 pod_container_manager_linux.go:210] 
"Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9b22e0c3-341e-444d-a615-50d5ccdc9f12"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9b22e0c3-341e-444d-a615-50d5ccdc9f12] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9b22e0c3_341e_444d_a615_50d5ccdc9f12.slice" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.165698 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.168123 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.170398 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.171703 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.171927 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.198296 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.229996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"auto-csr-approver-29566914-xfmxl\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.333743 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"auto-csr-approver-29566914-xfmxl\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.384399 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"auto-csr-approver-29566914-xfmxl\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.511610 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:01 crc kubenswrapper[4755]: I0320 13:54:01.025343 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 13:54:01 crc kubenswrapper[4755]: I0320 13:54:01.681307 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" event={"ID":"ea7c11fe-b29d-4fa4-a46d-7079105e883e","Type":"ContainerStarted","Data":"3707cc3f15982118da5f921713ade44647be0ba8beaf31b7fcc5a1b76df5334c"} Mar 20 13:54:02 crc kubenswrapper[4755]: I0320 13:54:02.694778 4755 generic.go:334] "Generic (PLEG): container finished" podID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerID="e563b8f3d31e55e3468e71d1526b9d84a5066f3dfe1e07450115316e1267a59c" exitCode=0 Mar 20 13:54:02 crc kubenswrapper[4755]: I0320 13:54:02.694861 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" event={"ID":"ea7c11fe-b29d-4fa4-a46d-7079105e883e","Type":"ContainerDied","Data":"e563b8f3d31e55e3468e71d1526b9d84a5066f3dfe1e07450115316e1267a59c"} Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 
13:54:04.113673 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.227763 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.239811 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq" (OuterVolumeSpecName: "kube-api-access-8z9vq") pod "ea7c11fe-b29d-4fa4-a46d-7079105e883e" (UID: "ea7c11fe-b29d-4fa4-a46d-7079105e883e"). InnerVolumeSpecName "kube-api-access-8z9vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.330642 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.717419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" event={"ID":"ea7c11fe-b29d-4fa4-a46d-7079105e883e","Type":"ContainerDied","Data":"3707cc3f15982118da5f921713ade44647be0ba8beaf31b7fcc5a1b76df5334c"} Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.717468 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3707cc3f15982118da5f921713ade44647be0ba8beaf31b7fcc5a1b76df5334c" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.717480 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.911070 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:54:04 crc kubenswrapper[4755]: E0320 13:54:04.911918 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerName="oc" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.911942 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerName="oc" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.912156 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerName="oc" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.913322 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.915621 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6z9hz"/"openshift-service-ca.crt" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.919645 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6z9hz"/"kube-root-ca.crt" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.919694 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6z9hz"/"default-dockercfg-4thtd" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.943217 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.943568 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.943767 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.049068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.049169 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.049740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.103522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wmwh\" (UniqueName: 
\"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.218511 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.234921 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.249014 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.734752 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:54:05 crc kubenswrapper[4755]: W0320 13:54:05.737135 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e7e4d4d_749a_4ec8_89f4_1362f7787e43.slice/crio-e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2 WatchSource:0}: Error finding container e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2: Status 404 returned error can't find the container with id e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2 Mar 20 13:54:06 crc kubenswrapper[4755]: I0320 13:54:06.747531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerStarted","Data":"e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2"} Mar 20 13:54:07 crc kubenswrapper[4755]: I0320 13:54:07.238794 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" 
path="/var/lib/kubelet/pods/a434c164-9ea6-4062-b8f6-88bb58f41a64/volumes" Mar 20 13:54:10 crc kubenswrapper[4755]: I0320 13:54:10.788029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerStarted","Data":"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4"} Mar 20 13:54:10 crc kubenswrapper[4755]: I0320 13:54:10.788704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerStarted","Data":"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde"} Mar 20 13:54:10 crc kubenswrapper[4755]: I0320 13:54:10.806219 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" podStartSLOduration=2.625588198 podStartE2EDuration="6.806206114s" podCreationTimestamp="2026-03-20 13:54:04 +0000 UTC" firstStartedPulling="2026-03-20 13:54:05.738759487 +0000 UTC m=+1425.336692016" lastFinishedPulling="2026-03-20 13:54:09.919377393 +0000 UTC m=+1429.517309932" observedRunningTime="2026-03-20 13:54:10.801200172 +0000 UTC m=+1430.399132701" watchObservedRunningTime="2026-03-20 13:54:10.806206114 +0000 UTC m=+1430.404138643" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.236440 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-kfkm2"] Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.238152 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.365587 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.365703 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.466876 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.467243 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.467445 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc 
kubenswrapper[4755]: I0320 13:54:15.492262 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.560119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.855173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" event={"ID":"1b539e13-5082-42e6-ac4d-a3f8fa788244","Type":"ContainerStarted","Data":"70470e286c3fd7782bb98a9a790ceaee1c9055dfab3df7a5943ee2120f5a6b70"} Mar 20 13:54:27 crc kubenswrapper[4755]: I0320 13:54:27.639996 4755 scope.go:117] "RemoveContainer" containerID="52d80a295f203def80f45f1a56a14d0c5758de39ba1147d6937ffde8c9d85ad7" Mar 20 13:54:27 crc kubenswrapper[4755]: I0320 13:54:27.967229 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" event={"ID":"1b539e13-5082-42e6-ac4d-a3f8fa788244","Type":"ContainerStarted","Data":"12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2"} Mar 20 13:54:27 crc kubenswrapper[4755]: I0320 13:54:27.984125 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" podStartSLOduration=1.628718084 podStartE2EDuration="12.98410567s" podCreationTimestamp="2026-03-20 13:54:15 +0000 UTC" firstStartedPulling="2026-03-20 13:54:15.607665691 +0000 UTC m=+1435.205598220" lastFinishedPulling="2026-03-20 13:54:26.963053277 +0000 UTC m=+1446.560985806" observedRunningTime="2026-03-20 13:54:27.981332137 +0000 UTC m=+1447.579264706" watchObservedRunningTime="2026-03-20 13:54:27.98410567 +0000 UTC 
m=+1447.582038209" Mar 20 13:54:46 crc kubenswrapper[4755]: I0320 13:54:46.140061 4755 generic.go:334] "Generic (PLEG): container finished" podID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerID="12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2" exitCode=0 Mar 20 13:54:46 crc kubenswrapper[4755]: I0320 13:54:46.140143 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" event={"ID":"1b539e13-5082-42e6-ac4d-a3f8fa788244","Type":"ContainerDied","Data":"12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2"} Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.257282 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.312585 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-kfkm2"] Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.330759 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-kfkm2"] Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379061 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"1b539e13-5082-42e6-ac4d-a3f8fa788244\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"1b539e13-5082-42e6-ac4d-a3f8fa788244\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379204 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host" (OuterVolumeSpecName: "host") pod "1b539e13-5082-42e6-ac4d-a3f8fa788244" (UID: "1b539e13-5082-42e6-ac4d-a3f8fa788244"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379703 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.385404 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn" (OuterVolumeSpecName: "kube-api-access-6ckgn") pod "1b539e13-5082-42e6-ac4d-a3f8fa788244" (UID: "1b539e13-5082-42e6-ac4d-a3f8fa788244"). InnerVolumeSpecName "kube-api-access-6ckgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.480943 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.160226 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70470e286c3fd7782bb98a9a790ceaee1c9055dfab3df7a5943ee2120f5a6b70" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.160344 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.494471 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-t5h2c"] Mar 20 13:54:48 crc kubenswrapper[4755]: E0320 13:54:48.494838 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerName="container-00" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.494850 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerName="container-00" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.495045 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerName="container-00" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.495592 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.600336 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.600751 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.703283 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.703364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.703466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.733750 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.812416 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: W0320 13:54:48.851759 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38c8e_0ed7_4de9_a7ef_b95a385aee6e.slice/crio-5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325 WatchSource:0}: Error finding container 5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325: Status 404 returned error can't find the container with id 5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325 Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.169678 4755 generic.go:334] "Generic (PLEG): container finished" podID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerID="6f88fa467b2aa84d0afc1eca44399e2bd3ea007654231eeee016bc077a3831b3" exitCode=1 Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.169772 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" event={"ID":"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e","Type":"ContainerDied","Data":"6f88fa467b2aa84d0afc1eca44399e2bd3ea007654231eeee016bc077a3831b3"} Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.170050 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" event={"ID":"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e","Type":"ContainerStarted","Data":"5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325"} Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.214258 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-t5h2c"] Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.222306 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-t5h2c"] Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.236587 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" path="/var/lib/kubelet/pods/1b539e13-5082-42e6-ac4d-a3f8fa788244/volumes" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.283404 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.331815 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.331879 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host" (OuterVolumeSpecName: "host") pod "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" (UID: "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.332256 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.332744 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.347347 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw" (OuterVolumeSpecName: "kube-api-access-889lw") pod "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" (UID: "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e"). InnerVolumeSpecName "kube-api-access-889lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.434056 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:51 crc kubenswrapper[4755]: I0320 13:54:51.190574 4755 scope.go:117] "RemoveContainer" containerID="6f88fa467b2aa84d0afc1eca44399e2bd3ea007654231eeee016bc077a3831b3" Mar 20 13:54:51 crc kubenswrapper[4755]: I0320 13:54:51.190708 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:51 crc kubenswrapper[4755]: I0320 13:54:51.241371 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" path="/var/lib/kubelet/pods/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e/volumes" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.760573 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:08 crc kubenswrapper[4755]: E0320 13:55:08.761800 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerName="container-00" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.761817 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerName="container-00" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.762064 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerName="container-00" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.763864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.763969 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.808939 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.809016 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.809306 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.910871 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.910928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"redhat-operators-k6d2r\" (UID: 
\"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.910967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.911413 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.911483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.942185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:09 crc kubenswrapper[4755]: I0320 13:55:09.091577 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:09 crc kubenswrapper[4755]: I0320 13:55:09.577910 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:10 crc kubenswrapper[4755]: I0320 13:55:10.389722 4755 generic.go:334] "Generic (PLEG): container finished" podID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" exitCode=0 Mar 20 13:55:10 crc kubenswrapper[4755]: I0320 13:55:10.389870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10"} Mar 20 13:55:10 crc kubenswrapper[4755]: I0320 13:55:10.390178 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerStarted","Data":"52b02238107f8ca4e75a6ad385761dee4f2cd32405f08f6c4064824ba74ac65f"} Mar 20 13:55:13 crc kubenswrapper[4755]: I0320 13:55:13.423509 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerStarted","Data":"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00"} Mar 20 13:55:14 crc kubenswrapper[4755]: I0320 13:55:14.441316 4755 generic.go:334] "Generic (PLEG): container finished" podID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" exitCode=0 Mar 20 13:55:14 crc kubenswrapper[4755]: I0320 13:55:14.441512 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" 
event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00"} Mar 20 13:55:15 crc kubenswrapper[4755]: I0320 13:55:15.466761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerStarted","Data":"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a"} Mar 20 13:55:15 crc kubenswrapper[4755]: I0320 13:55:15.493090 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k6d2r" podStartSLOduration=2.783330028 podStartE2EDuration="7.493052688s" podCreationTimestamp="2026-03-20 13:55:08 +0000 UTC" firstStartedPulling="2026-03-20 13:55:10.391854328 +0000 UTC m=+1489.989786857" lastFinishedPulling="2026-03-20 13:55:15.101576988 +0000 UTC m=+1494.699509517" observedRunningTime="2026-03-20 13:55:15.48632175 +0000 UTC m=+1495.084254279" watchObservedRunningTime="2026-03-20 13:55:15.493052688 +0000 UTC m=+1495.090985267" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.091972 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.093261 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.452885 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-1376-account-create-update-jhbhp_015c8ae7-1856-4b0c-b5ce-e2503a2080dc/mariadb-account-create-update/0.log" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.568070 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7769db74db-f4kfh_45fa2a85-b7d9-413c-827c-fdcbcec05faf/barbican-api/0.log" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 
13:55:19.653767 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7769db74db-f4kfh_45fa2a85-b7d9-413c-827c-fdcbcec05faf/barbican-api-log/0.log" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.859412 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-jm9nr_feb55e83-711d-4561-8b57-2a231944e1b1/mariadb-database-create/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.119439 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-dtggj_95c76f8c-7b76-4714-adac-6297b84d6492/barbican-db-sync/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.151282 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:20 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:20 crc kubenswrapper[4755]: > Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.274109 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-cbc45f8f6-z2sx8_55a78d73-f853-49d7-99b2-81c25ea6bb20/barbican-keystone-listener/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.373507 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-cbc45f8f6-z2sx8_55a78d73-f853-49d7-99b2-81c25ea6bb20/barbican-keystone-listener-log/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.428449 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56b9dc5449-j62ns_d2108220-35b4-45b7-a2bc-e93138394ff0/barbican-worker/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.528377 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-56b9dc5449-j62ns_d2108220-35b4-45b7-a2bc-e93138394ff0/barbican-worker-log/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.735893 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/ceilometer-central-agent/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.744527 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/ceilometer-notification-agent/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.755499 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/proxy-httpd/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.923095 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/sg-core/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.030955 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-9528-account-create-update-6xkmx_5dde547e-5fce-4868-ba0e-63650ea0c771/mariadb-account-create-update/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.103802 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9894c7cb-7899-4354-a6c2-e7339eb1f765/cinder-api/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.195672 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9894c7cb-7899-4354-a6c2-e7339eb1f765/cinder-api-log/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.302361 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-2jwbt_8c5d05dc-a589-4d2e-9374-0d57202a3cfc/mariadb-database-create/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.384152 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-db-sync-jrf8c_25bd1da4-7fdb-4bd9-8405-a37fc6c18be0/cinder-db-sync/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.519860 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df39e954-98b1-4c7c-bc51-5c2ee4db8a6d/cinder-scheduler/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.549620 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df39e954-98b1-4c7c-bc51-5c2ee4db8a6d/probe/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.694325 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mkxft_204ff403-3d73-430e-aa64-a41f033f641e/init/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.833379 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mkxft_204ff403-3d73-430e-aa64-a41f033f641e/dnsmasq-dns/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.869197 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mkxft_204ff403-3d73-430e-aa64-a41f033f641e/init/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.924176 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-35fe-account-create-update-h6fl8_46d041c2-e231-49fd-9d88-a991a1b9dd65/mariadb-account-create-update/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.074296 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-pg2bq_6fe77db3-29ef-42ae-840b-9736f07188ca/mariadb-database-create/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.144247 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-w78rr_3047e6fe-5128-4361-bede-e9f0c4e9387c/glance-db-sync/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.327207 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_e65d1645-8a19-459e-ac89-b485f27e2841/glance-log/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.343908 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e65d1645-8a19-459e-ac89-b485f27e2841/glance-httpd/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.533761 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6b182ae3-20c9-48af-9313-d48a608924b1/glance-log/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.550639 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6b182ae3-20c9-48af-9313-d48a608924b1/glance-httpd/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.673690 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9f7d4c74d-t7tpq_2af5836e-8c76-4432-95c0-ef34d6fc3528/horizon/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.796064 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9f7d4c74d-t7tpq_2af5836e-8c76-4432-95c0-ef34d6fc3528/horizon-log/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.875992 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8e2f-account-create-update-cvvh2_79c00857-0d6a-4c12-8581-da16e2a24f04/mariadb-account-create-update/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.036104 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8f554bbf4-zvxzv_ab9d92e7-deba-4bdd-a267-e35fd5ec2f23/keystone-api/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.123530 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-rwsvb_5dddb768-c318-44b8-bac9-ea26f29ca038/keystone-bootstrap/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.213496 4755 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-g2hvs_0795b626-b382-4b9b-beb5-802cebc4f764/mariadb-database-create/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.287028 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-9xrbx_64ad8e64-0606-4171-bd2d-ae8212fdff8f/keystone-db-sync/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.353282 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4f27a8a2-0755-47ae-a7b4-63787c8c9393/kube-state-metrics/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.567164 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754b98cbff-jgntp_0263cee7-e9d5-48ff-8326-7455a95311a6/neutron-httpd/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.630205 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754b98cbff-jgntp_0263cee7-e9d5-48ff-8326-7455a95311a6/neutron-api/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.719722 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-hm9qz_e38d31ac-eae6-4cd1-be04-304215db852a/mariadb-database-create/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.874186 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-52m67_69707be4-e338-4e13-8ecc-8cfd7cd416b2/neutron-db-sync/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.943533 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-fc04-account-create-update-x9t57_34c85756-25cf-4302-bd5d-72f2e459f562/mariadb-account-create-update/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.173089 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b975ad31-5e47-43b2-a0c6-4d1ee9e50006/nova-api-log/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.226114 
4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b975ad31-5e47-43b2-a0c6-4d1ee9e50006/nova-api-api/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.258965 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-c99c-account-create-update-5s889_03accbff-bdf2-4256-bdf2-1b39d5485673/mariadb-account-create-update/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.372522 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-9jv87_0deb3f1a-0cad-4429-9e79-38e5a0b38896/mariadb-database-create/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.486213 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-4e76-account-create-update-vjcr6_523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86/mariadb-account-create-update/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.583276 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-vz8fw_2ff73477-b65b-4362-938c-94b1bb1f51b0/nova-manage/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.008892 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_676b01c6-a64d-4530-b157-10160afd719a/nova-cell0-conductor-conductor/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.016019 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-mbd9g_faef786e-b221-4fff-8d48-42b8163ed86a/nova-cell0-conductor-db-sync/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.131511 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-79jc8_32a5606c-c777-4c0b-951c-6ce2e03edd7e/mariadb-database-create/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.228422 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-cell-mapping-7fs4m_557c5385-782c-410a-a371-b27f41d88a47/nova-manage/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.464926 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_32aa4c4f-3c67-46f5-90ae-59d17077eb1d/nova-cell1-conductor-conductor/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.492820 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-qbtvj_cadbdc7c-ed66-43d7-82ee-d797beb959a8/nova-cell1-conductor-db-sync/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.673815 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-jqk4f_f395acec-f28b-4622-b349-127cf31ec92d/mariadb-database-create/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.679767 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-ee84-account-create-update-jpmvf_39991203-9b8d-4985-8e90-b3d1772f6b8f/mariadb-account-create-update/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.931460 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8318edf5-5648-4c19-8853-3d555435ed6f/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.041012 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80d8fd29-d89f-4955-86f3-a8137400c67b/nova-metadata-log/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.196637 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80d8fd29-d89f-4955-86f3-a8137400c67b/nova-metadata-metadata/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.283009 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_15175c96-bbe4-4a56-be68-a5db33909e54/nova-scheduler-scheduler/0.log" Mar 20 13:55:26 crc 
kubenswrapper[4755]: I0320 13:55:26.357477 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.563913 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.616037 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_23ab8e52-0cde-43ec-af8d-24f794695200/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.625262 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe/galera/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.760144 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_23ab8e52-0cde-43ec-af8d-24f794695200/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.826462 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_23ab8e52-0cde-43ec-af8d-24f794695200/galera/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.866824 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_96136572-ead6-4771-bd36-eec29b5fb137/openstackclient/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.028490 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kbcdp_408d869f-0966-4908-88e5-37cdff345c4a/ovn-controller/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.106198 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nmwms_3a0e99e7-7429-41a7-bff7-23cafba6b78a/openstack-network-exporter/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 
13:55:27.229353 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovsdb-server-init/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.461761 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovs-vswitchd/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.501386 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovsdb-server-init/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.538909 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovsdb-server/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.679631 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d1bdd912-fe33-4449-aed8-12a5ee09961e/ovn-northd/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.719560 4755 scope.go:117] "RemoveContainer" containerID="d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.722637 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d1bdd912-fe33-4449-aed8-12a5ee09961e/openstack-network-exporter/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.743972 4755 scope.go:117] "RemoveContainer" containerID="cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.795865 4755 scope.go:117] "RemoveContainer" containerID="bf3c4c3fe9431051d31c8d3be691fe02ec3059d025e2ec130cb4e7e269504bb9" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.898608 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_de877bb8-b1cd-45de-94c1-5242659fd03e/openstack-network-exporter/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.910453 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_de877bb8-b1cd-45de-94c1-5242659fd03e/ovsdbserver-nb/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.113380 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fed1ecda-4acb-4a4c-a84e-12e58b3ad243/openstack-network-exporter/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.118569 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fed1ecda-4acb-4a4c-a84e-12e58b3ad243/ovsdbserver-sb/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.259327 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65884d74bb-n9mkw_0187d784-0bbe-4f5f-9b84-ee240bb90970/placement-api/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.316390 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65884d74bb-n9mkw_0187d784-0bbe-4f5f-9b84-ee240bb90970/placement-log/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.384695 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9157-account-create-update-q8r48_0587eb58-cd5e-4e0b-be30-97e0a569fc57/mariadb-account-create-update/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.516946 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-ng8vm_2af42784-d5cc-4f7c-832a-f91dbd54cc3f/mariadb-database-create/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.588129 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-cxr9p_7ea35a84-68ca-4490-b1d9-fa999ef63ebe/placement-db-sync/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.712999 4755 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d21386c-8267-4dba-9028-d5cb729ff78b/setup-container/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.898648 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d21386c-8267-4dba-9028-d5cb729ff78b/setup-container/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.006644 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2ca344f-8f18-4dd9-9e5c-44669ff2da4f/setup-container/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.065246 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d21386c-8267-4dba-9028-d5cb729ff78b/rabbitmq/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.189331 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2ca344f-8f18-4dd9-9e5c-44669ff2da4f/setup-container/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.218886 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2ca344f-8f18-4dd9-9e5c-44669ff2da4f/rabbitmq/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.271214 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-jvtvk_8ae45e95-b96a-4157-a584-a6eb321d5091/mariadb-account-create-update/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.444386 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-847679bbfc-l8kwj_12a81787-83e5-4552-85e6-19733309756d/proxy-server/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.539735 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-847679bbfc-l8kwj_12a81787-83e5-4552-85e6-19733309756d/proxy-httpd/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.576453 4755 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-j55xs_c13f5042-e5e5-47a3-bc96-b504a0bf9af2/swift-ring-rebalance/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.734698 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-auditor/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.773982 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-reaper/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.779136 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-replicator/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.910129 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-server/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.961035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-auditor/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.992950 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-replicator/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.036298 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-server/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.119583 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-updater/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.138856 4755 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:30 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:30 crc kubenswrapper[4755]: > Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.223138 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-expirer/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.226399 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-auditor/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.269449 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-replicator/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.315124 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-server/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.405056 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-updater/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.439757 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/rsync/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.487679 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/swift-recon-cron/0.log" Mar 20 13:55:31 crc kubenswrapper[4755]: I0320 13:55:31.582957 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1786d302-95f2-410e-8280-14a89cbaf48c/memcached/0.log" Mar 20 13:55:36 crc 
kubenswrapper[4755]: I0320 13:55:36.750977 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:55:36 crc kubenswrapper[4755]: I0320 13:55:36.751630 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:55:40 crc kubenswrapper[4755]: I0320 13:55:40.140195 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:40 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:40 crc kubenswrapper[4755]: > Mar 20 13:55:50 crc kubenswrapper[4755]: I0320 13:55:50.146312 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:50 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:50 crc kubenswrapper[4755]: > Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.376594 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/util/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.594971 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/util/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.647192 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/pull/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.659783 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/pull/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.836097 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/pull/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.846205 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/util/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.899119 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/extract/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.096602 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-ds4tb_3a22a8d8-92cd-4177-a597-9c659673392c/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.434246 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-xd8mk_00fc80a4-4ea8-4f61-8795-6473f0adc40a/manager/0.log" Mar 20 13:55:56 crc 
kubenswrapper[4755]: I0320 13:55:56.551514 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-cbj27_bc80030a-428b-4643-9d8d-2b0e9c873060/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.636067 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-92fwj_552e0390-e86e-4972-bf6f-a4570e6b6f81/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.831201 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-hxmnd_52210224-8989-4e16-8fdf-4ea3a8211b10/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.902809 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-pr9d5_4c1ba89a-aed6-4245-8411-4d1fecac2500/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.162414 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-f2nbs_21c9358d-2c84-4c38-9c91-8ca3dad4dab7/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.170331 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b55fff5bb-sm4wg_83d6120d-b54b-452c-aa8a-026665f1afae/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.282254 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-4x5nd_5a83ca27-3334-4aac-9129-5635d3af0714/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.365839 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-gszjd_7f51051e-6a90-4582-a411-28a106c37118/manager/0.log" Mar 20 
13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.502362 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-2d9hb_fe4ddc70-f382-4b32-8879-122023b45438/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.664168 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-c8crg_1aaef0d5-16fe-4c61-82d5-660f29168171/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.789316 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-r8jn7_358d4809-db3b-4468-8c8c-4ffbedc0ec89/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.861584 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-c5nbs_b3c037b9-79d2-45ea-9b92-66e50eb20e6b/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.985790 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f58nx7v_bad91c65-94da-4f8a-addb-21b037197217/manager/0.log" Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.403809 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6c645d7445-cbmxt_e837c2d9-26ab-47a1-b48a-44f28fc2e2a6/operator/0.log" Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.691477 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-98hpf_b58ff15e-f098-460d-ada4-3bdd990125ba/registry-server/0.log" Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.804342 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-2xgt8_72ea5d65-7221-4b25-9025-7a5c31bae331/manager/0.log" 
Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.965687 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-j7qf2_a1b32bae-fa65-45aa-a8db-b46a7351ee2c/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.028287 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-795d5ff795-ld7m6_42c9c167-c386-4d60-868c-8b0b63fccbcd/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.089824 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-bfh6x_14b4b9ba-026c-4fd7-a57d-545e62b6981e/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.150084 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.201209 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.280948 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-hpmzq_bc93761d-ecc1-4179-8287-40fd76ba5ad1/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.343679 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-4khh5_26dfba7a-f5fa-45bc-a187-91ddce4da2d6/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.386424 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.511944 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-z9px7_88862bd4-c890-447c-b4ee-b9cb1a4928e8/manager/0.log" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.140577 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.141771 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.146084 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.146247 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.146321 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.153429 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.295965 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"auto-csr-approver-29566916-jf8sk\" (UID: \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.397390 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"auto-csr-approver-29566916-jf8sk\" (UID: 
\"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.419364 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"auto-csr-approver-29566916-jf8sk\" (UID: \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.507879 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.861406 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" containerID="cri-o://e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" gracePeriod=2 Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.986991 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.012696 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.270310 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.315630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"2363273e-8f78-4383-a21b-23f0d8a234b4\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.315785 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"2363273e-8f78-4383-a21b-23f0d8a234b4\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.315878 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"2363273e-8f78-4383-a21b-23f0d8a234b4\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.318762 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities" (OuterVolumeSpecName: "utilities") pod "2363273e-8f78-4383-a21b-23f0d8a234b4" (UID: "2363273e-8f78-4383-a21b-23f0d8a234b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.333260 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq" (OuterVolumeSpecName: "kube-api-access-rw4gq") pod "2363273e-8f78-4383-a21b-23f0d8a234b4" (UID: "2363273e-8f78-4383-a21b-23f0d8a234b4"). InnerVolumeSpecName "kube-api-access-rw4gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.417871 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.417906 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.448306 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2363273e-8f78-4383-a21b-23f0d8a234b4" (UID: "2363273e-8f78-4383-a21b-23f0d8a234b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.519749 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.870572 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" event={"ID":"c74d4c86-05c3-4ac3-a18e-cb75b4d95559","Type":"ContainerStarted","Data":"12e4c310d804ddc74263622c07486fe4db07e801bcdc1efdc2be905d37eaf6f0"} Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873784 4755 generic.go:334] "Generic (PLEG): container finished" podID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" exitCode=0 Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873845 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a"} Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"52b02238107f8ca4e75a6ad385761dee4f2cd32405f08f6c4064824ba74ac65f"} Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873898 4755 scope.go:117] "RemoveContainer" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.874038 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.908837 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.931146 4755 scope.go:117] "RemoveContainer" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.931532 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.971545 4755 scope.go:117] "RemoveContainer" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.004837 4755 scope.go:117] "RemoveContainer" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" Mar 20 13:56:02 crc kubenswrapper[4755]: E0320 13:56:02.006641 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a\": container with ID starting with e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a not found: ID does not exist" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.006696 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a"} err="failed to get container status \"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a\": rpc error: code = NotFound desc = could not find container \"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a\": container with ID starting with e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a not found: ID does not exist" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.006722 4755 scope.go:117] "RemoveContainer" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" Mar 20 13:56:02 crc kubenswrapper[4755]: E0320 13:56:02.008136 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00\": container with ID starting with 3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00 not found: ID does not exist" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.008188 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00"} err="failed to get container status \"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00\": rpc error: code = NotFound desc = could not find container \"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00\": container with ID 
starting with 3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00 not found: ID does not exist" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.008218 4755 scope.go:117] "RemoveContainer" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" Mar 20 13:56:02 crc kubenswrapper[4755]: E0320 13:56:02.008548 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10\": container with ID starting with 9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10 not found: ID does not exist" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.008589 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10"} err="failed to get container status \"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10\": rpc error: code = NotFound desc = could not find container \"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10\": container with ID starting with 9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10 not found: ID does not exist" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.885074 4755 generic.go:334] "Generic (PLEG): container finished" podID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerID="ce9d805a2c4c50680c23940622d796b78d00ed9243eb4db8b57356fad93506d8" exitCode=0 Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.885164 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" event={"ID":"c74d4c86-05c3-4ac3-a18e-cb75b4d95559","Type":"ContainerDied","Data":"ce9d805a2c4c50680c23940622d796b78d00ed9243eb4db8b57356fad93506d8"} Mar 20 13:56:03 crc kubenswrapper[4755]: I0320 13:56:03.235752 4755 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" path="/var/lib/kubelet/pods/2363273e-8f78-4383-a21b-23f0d8a234b4/volumes" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.267127 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.374932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\" (UID: \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.383015 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2" (OuterVolumeSpecName: "kube-api-access-6mqs2") pod "c74d4c86-05c3-4ac3-a18e-cb75b4d95559" (UID: "c74d4c86-05c3-4ac3-a18e-cb75b4d95559"). InnerVolumeSpecName "kube-api-access-6mqs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.478686 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.919422 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" event={"ID":"c74d4c86-05c3-4ac3-a18e-cb75b4d95559","Type":"ContainerDied","Data":"12e4c310d804ddc74263622c07486fe4db07e801bcdc1efdc2be905d37eaf6f0"} Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.919832 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e4c310d804ddc74263622c07486fe4db07e801bcdc1efdc2be905d37eaf6f0" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.919500 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:05 crc kubenswrapper[4755]: I0320 13:56:05.336382 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:56:05 crc kubenswrapper[4755]: I0320 13:56:05.344684 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:56:06 crc kubenswrapper[4755]: I0320 13:56:06.751739 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:56:06 crc kubenswrapper[4755]: I0320 13:56:06.752144 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" 
podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:07 crc kubenswrapper[4755]: I0320 13:56:07.237591 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" path="/var/lib/kubelet/pods/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f/volumes" Mar 20 13:56:20 crc kubenswrapper[4755]: I0320 13:56:20.911549 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2bm86_85fb2982-9af0-4450-80f4-12fbd6e7a590/control-plane-machine-set-operator/0.log" Mar 20 13:56:21 crc kubenswrapper[4755]: I0320 13:56:21.103818 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4zdx6_1fdd6691-9136-43ba-abea-7ba6862e9681/kube-rbac-proxy/0.log" Mar 20 13:56:21 crc kubenswrapper[4755]: I0320 13:56:21.172781 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4zdx6_1fdd6691-9136-43ba-abea-7ba6862e9681/machine-api-operator/0.log" Mar 20 13:56:27 crc kubenswrapper[4755]: I0320 13:56:27.982438 4755 scope.go:117] "RemoveContainer" containerID="5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418" Mar 20 13:56:28 crc kubenswrapper[4755]: I0320 13:56:28.039149 4755 scope.go:117] "RemoveContainer" containerID="bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae" Mar 20 13:56:35 crc kubenswrapper[4755]: I0320 13:56:35.351907 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7gpgn_cdf5c938-39f0-46a4-bce6-1a0cf67624ab/cert-manager-controller/0.log" Mar 20 13:56:35 crc kubenswrapper[4755]: I0320 13:56:35.423197 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-l955j_a3125fba-bed9-40d3-b53d-f976488e12d2/cert-manager-cainjector/0.log" Mar 20 13:56:35 crc kubenswrapper[4755]: I0320 13:56:35.535575 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hbz2p_f3b802e1-c690-4817-91cf-d721cbfae51c/cert-manager-webhook/0.log" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.637524 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.637996 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerName="oc" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638014 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerName="oc" Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.638056 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638064 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.638082 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-content" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638091 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-content" Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.638103 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-utilities" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638110 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-utilities" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638337 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerName="oc" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638357 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.640064 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.662423 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.725239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.725642 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.725784 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod 
\"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751313 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751366 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751402 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751999 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.752054 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a" gracePeriod=600 Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828086 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828151 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828204 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828680 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828766 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.851279 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.957918 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211357 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a" exitCode=0 Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a"} Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211765 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154"} Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211781 4755 scope.go:117] "RemoveContainer" containerID="0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74" Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.541833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:37 crc kubenswrapper[4755]: W0320 13:56:37.546071 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda94c1d23_46c0_439e_8d37_f3bbfeed4646.slice/crio-8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041 WatchSource:0}: Error finding container 8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041: Status 404 returned error can't find the container with id 8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041 Mar 20 13:56:38 crc kubenswrapper[4755]: I0320 13:56:38.227928 4755 generic.go:334] "Generic (PLEG): container finished" podID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" exitCode=0 Mar 20 13:56:38 crc kubenswrapper[4755]: I0320 13:56:38.227982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50"} Mar 20 13:56:38 crc kubenswrapper[4755]: I0320 13:56:38.228361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerStarted","Data":"8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041"} Mar 20 13:56:39 crc kubenswrapper[4755]: I0320 13:56:39.242443 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerStarted","Data":"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b"} Mar 20 13:56:40 crc kubenswrapper[4755]: I0320 13:56:40.254838 4755 generic.go:334] "Generic (PLEG): container finished" podID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" exitCode=0 Mar 20 13:56:40 crc kubenswrapper[4755]: I0320 13:56:40.254887 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b"} Mar 20 13:56:41 crc kubenswrapper[4755]: I0320 13:56:41.267182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerStarted","Data":"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52"} Mar 20 13:56:41 crc kubenswrapper[4755]: I0320 13:56:41.299349 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m5fpr" podStartSLOduration=2.822440043 podStartE2EDuration="5.299325827s" podCreationTimestamp="2026-03-20 13:56:36 +0000 UTC" firstStartedPulling="2026-03-20 13:56:38.230534338 +0000 UTC m=+1577.828466907" lastFinishedPulling="2026-03-20 13:56:40.707420162 +0000 UTC m=+1580.305352691" observedRunningTime="2026-03-20 13:56:41.28810232 +0000 UTC m=+1580.886034849" watchObservedRunningTime="2026-03-20 13:56:41.299325827 +0000 UTC m=+1580.897258366" Mar 20 13:56:46 crc kubenswrapper[4755]: I0320 13:56:46.958724 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:46 crc kubenswrapper[4755]: I0320 13:56:46.959476 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:47 crc kubenswrapper[4755]: I0320 13:56:47.015156 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:47 crc kubenswrapper[4755]: I0320 13:56:47.409531 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:47 crc kubenswrapper[4755]: I0320 13:56:47.474065 
4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.351519 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-2hkvk_a9993046-1fc7-4faa-a634-f91339d94c71/nmstate-console-plugin/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.353833 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m5fpr" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" containerID="cri-o://706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" gracePeriod=2 Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.530648 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dspfd_2e4b8ce9-115c-4c39-9f1b-a5681ded9b68/nmstate-handler/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.589034 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-68g6g_284e4beb-7815-41fc-ac59-95ed647c0d7c/kube-rbac-proxy/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.664148 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-68g6g_284e4beb-7815-41fc-ac59-95ed647c0d7c/nmstate-metrics/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.761366 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-rz567_93adf7be-d696-48e2-b6d5-af27b19b24e3/nmstate-operator/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.832412 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.863390 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-72787_36f8cd57-a5ee-4a30-b7b6-8f13d698861c/nmstate-webhook/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.882594 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.882640 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.882718 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.883733 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities" (OuterVolumeSpecName: "utilities") pod "a94c1d23-46c0-439e-8d37-f3bbfeed4646" (UID: "a94c1d23-46c0-439e-8d37-f3bbfeed4646"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.889883 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9" (OuterVolumeSpecName: "kube-api-access-klpq9") pod "a94c1d23-46c0-439e-8d37-f3bbfeed4646" (UID: "a94c1d23-46c0-439e-8d37-f3bbfeed4646"). InnerVolumeSpecName "kube-api-access-klpq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.985471 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.985509 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364115 4755 generic.go:334] "Generic (PLEG): container finished" podID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" exitCode=0 Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364160 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52"} Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364180 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364198 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041"} Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364222 4755 scope.go:117] "RemoveContainer" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.396299 4755 scope.go:117] "RemoveContainer" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.422802 4755 scope.go:117] "RemoveContainer" containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.467989 4755 scope.go:117] "RemoveContainer" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" Mar 20 13:56:50 crc kubenswrapper[4755]: E0320 13:56:50.468397 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52\": container with ID starting with 706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52 not found: ID does not exist" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468450 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52"} err="failed to get container status \"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52\": rpc error: code = NotFound desc = could not find container 
\"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52\": container with ID starting with 706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52 not found: ID does not exist" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468495 4755 scope.go:117] "RemoveContainer" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" Mar 20 13:56:50 crc kubenswrapper[4755]: E0320 13:56:50.468868 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b\": container with ID starting with a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b not found: ID does not exist" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468903 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b"} err="failed to get container status \"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b\": rpc error: code = NotFound desc = could not find container \"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b\": container with ID starting with a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b not found: ID does not exist" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468926 4755 scope.go:117] "RemoveContainer" containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" Mar 20 13:56:50 crc kubenswrapper[4755]: E0320 13:56:50.469334 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50\": container with ID starting with c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50 not found: ID does not exist" 
containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.469387 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50"} err="failed to get container status \"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50\": rpc error: code = NotFound desc = could not find container \"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50\": container with ID starting with c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50 not found: ID does not exist" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.532331 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a94c1d23-46c0-439e-8d37-f3bbfeed4646" (UID: "a94c1d23-46c0-439e-8d37-f3bbfeed4646"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.598730 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.697167 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.710165 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:51 crc kubenswrapper[4755]: I0320 13:56:51.241939 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" path="/var/lib/kubelet/pods/a94c1d23-46c0-439e-8d37-f3bbfeed4646/volumes" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.844773 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:06 crc kubenswrapper[4755]: E0320 13:57:06.845721 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-utilities" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845734 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-utilities" Mar 20 13:57:06 crc kubenswrapper[4755]: E0320 13:57:06.845757 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845763 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" Mar 20 13:57:06 crc kubenswrapper[4755]: E0320 13:57:06.845794 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-content" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845800 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-content" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845959 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.847182 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.866689 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.895437 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.895591 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.895632 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"redhat-marketplace-4wl28\" 
(UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.997842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998026 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998501 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998601 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " 
pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:07 crc kubenswrapper[4755]: I0320 13:57:07.017017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:07 crc kubenswrapper[4755]: I0320 13:57:07.175119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:07 crc kubenswrapper[4755]: I0320 13:57:07.663463 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:08 crc kubenswrapper[4755]: E0320 13:57:08.090014 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a373048_a6fb_43f3_86bf_cc41057c8ecd.slice/crio-conmon-5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a373048_a6fb_43f3_86bf_cc41057c8ecd.slice/crio-5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:57:08 crc kubenswrapper[4755]: I0320 13:57:08.539756 4755 generic.go:334] "Generic (PLEG): container finished" podID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" exitCode=0 Mar 20 13:57:08 crc kubenswrapper[4755]: I0320 13:57:08.539802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" 
event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40"} Mar 20 13:57:08 crc kubenswrapper[4755]: I0320 13:57:08.539828 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerStarted","Data":"30a8bd4f907483e441f264aeaf06d49007454d93dd8d2113c45351adc85d9f47"} Mar 20 13:57:09 crc kubenswrapper[4755]: I0320 13:57:09.550856 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerStarted","Data":"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a"} Mar 20 13:57:10 crc kubenswrapper[4755]: I0320 13:57:10.562898 4755 generic.go:334] "Generic (PLEG): container finished" podID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" exitCode=0 Mar 20 13:57:10 crc kubenswrapper[4755]: I0320 13:57:10.563817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a"} Mar 20 13:57:11 crc kubenswrapper[4755]: I0320 13:57:11.575354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerStarted","Data":"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d"} Mar 20 13:57:11 crc kubenswrapper[4755]: I0320 13:57:11.595826 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4wl28" podStartSLOduration=3.146818998 podStartE2EDuration="5.595808274s" podCreationTimestamp="2026-03-20 13:57:06 +0000 
UTC" firstStartedPulling="2026-03-20 13:57:08.541491128 +0000 UTC m=+1608.139423667" lastFinishedPulling="2026-03-20 13:57:10.990480414 +0000 UTC m=+1610.588412943" observedRunningTime="2026-03-20 13:57:11.590453263 +0000 UTC m=+1611.188385802" watchObservedRunningTime="2026-03-20 13:57:11.595808274 +0000 UTC m=+1611.193740803" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.175877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.176622 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.239753 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.701682 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.748907 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.277535 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qsbbn_a71f1548-62b5-4a77-9655-735bafa396c8/kube-rbac-proxy/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.401309 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qsbbn_a71f1548-62b5-4a77-9655-735bafa396c8/controller/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.464744 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.689594 4755 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.689633 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.706641 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.707861 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.875175 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.893465 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.909001 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.913599 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.044295 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.061821 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.068956 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.093762 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/controller/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.228471 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/frr-metrics/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.284035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/kube-rbac-proxy/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.310324 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/kube-rbac-proxy-frr/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.437445 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/reloader/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.526878 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-7xgrp_490ee5e7-c0b1-4181-b7ac-86e5e61253a0/frr-k8s-webhook-server/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.671360 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4wl28" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" containerID="cri-o://85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" 
gracePeriod=2 Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.767392 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6ddbc48b88-k4d8p_32289872-a679-4d10-8b2f-0519c713dc35/manager/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.938213 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-588c694cdc-8vjlb_f0274fca-6425-402c-a2aa-853b232ad93c/webhook-server/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.016015 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6vf4n_839a8db3-662c-41c4-bb63-6b1027901ab5/kube-rbac-proxy/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.097160 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/frr/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.284026 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.475896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.476170 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.476203 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.477739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities" (OuterVolumeSpecName: "utilities") pod "5a373048-a6fb-43f3-86bf-cc41057c8ecd" (UID: "5a373048-a6fb-43f3-86bf-cc41057c8ecd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.484150 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4" (OuterVolumeSpecName: "kube-api-access-4w7h4") pod "5a373048-a6fb-43f3-86bf-cc41057c8ecd" (UID: "5a373048-a6fb-43f3-86bf-cc41057c8ecd"). InnerVolumeSpecName "kube-api-access-4w7h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.491163 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6vf4n_839a8db3-662c-41c4-bb63-6b1027901ab5/speaker/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.506244 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a373048-a6fb-43f3-86bf-cc41057c8ecd" (UID: "5a373048-a6fb-43f3-86bf-cc41057c8ecd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.578686 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.578724 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.578733 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682033 4755 generic.go:334] "Generic (PLEG): container finished" podID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" exitCode=0 Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682072 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" 
event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d"} Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"30a8bd4f907483e441f264aeaf06d49007454d93dd8d2113c45351adc85d9f47"} Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682307 4755 scope.go:117] "RemoveContainer" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682307 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.702221 4755 scope.go:117] "RemoveContainer" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.713162 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.722108 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.734897 4755 scope.go:117] "RemoveContainer" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.768400 4755 scope.go:117] "RemoveContainer" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" Mar 20 13:57:20 crc kubenswrapper[4755]: E0320 13:57:20.768865 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d\": container 
with ID starting with 85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d not found: ID does not exist" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.768909 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d"} err="failed to get container status \"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d\": rpc error: code = NotFound desc = could not find container \"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d\": container with ID starting with 85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d not found: ID does not exist" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.768938 4755 scope.go:117] "RemoveContainer" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" Mar 20 13:57:20 crc kubenswrapper[4755]: E0320 13:57:20.769234 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a\": container with ID starting with 212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a not found: ID does not exist" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.769265 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a"} err="failed to get container status \"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a\": rpc error: code = NotFound desc = could not find container \"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a\": container with ID starting with 212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a not 
found: ID does not exist" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.769287 4755 scope.go:117] "RemoveContainer" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" Mar 20 13:57:20 crc kubenswrapper[4755]: E0320 13:57:20.769708 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40\": container with ID starting with 5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40 not found: ID does not exist" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.769734 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40"} err="failed to get container status \"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40\": rpc error: code = NotFound desc = could not find container \"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40\": container with ID starting with 5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40 not found: ID does not exist" Mar 20 13:57:21 crc kubenswrapper[4755]: I0320 13:57:21.237945 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" path="/var/lib/kubelet/pods/5a373048-a6fb-43f3-86bf-cc41057c8ecd/volumes" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.678045 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/util/0.log" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.811285 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/util/0.log" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.909988 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/pull/0.log" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.919566 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.049898 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.051186 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.133070 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/extract/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.236162 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.402064 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/pull/0.log" Mar 20 
13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.421153 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.422087 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.596450 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.602526 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/extract/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.605086 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.765435 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-utilities/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.925474 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-content/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.937339 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-utilities/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.948057 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.132071 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.151273 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.384502 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.459434 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.470276 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/registry-server/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.480638 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.569359 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.728170 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.731888 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.956585 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ngw4b_6d1fc18c-b364-439b-926f-12fe310d0917/marketplace-operator/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.023540 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.063463 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/registry-server/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.161442 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.204953 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.211616 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.412274 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.430035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.479277 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/registry-server/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.609118 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.790596 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.792375 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.816843 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.953636 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-utilities/0.log" 
Mar 20 13:57:37 crc kubenswrapper[4755]: I0320 13:57:37.047770 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-content/0.log" Mar 20 13:57:37 crc kubenswrapper[4755]: I0320 13:57:37.214047 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/registry-server/0.log" Mar 20 13:57:53 crc kubenswrapper[4755]: E0320 13:57:53.495332 4755 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.181:39742->38.102.83.181:38787: write tcp 38.102.83.181:39742->38.102.83.181:38787: write: broken pipe Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.174156 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566918-krhnp"] Mar 20 13:58:00 crc kubenswrapper[4755]: E0320 13:58:00.175493 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-utilities" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175506 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-utilities" Mar 20 13:58:00 crc kubenswrapper[4755]: E0320 13:58:00.175531 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175537 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4755]: E0320 13:58:00.175547 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-content" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175553 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-content" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175844 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.176533 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.179560 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.179834 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.180006 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.192833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-krhnp"] Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.208866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"auto-csr-approver-29566918-krhnp\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.311928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"auto-csr-approver-29566918-krhnp\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " 
pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.333423 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"auto-csr-approver-29566918-krhnp\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.507775 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:01 crc kubenswrapper[4755]: I0320 13:58:01.026030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-krhnp"] Mar 20 13:58:01 crc kubenswrapper[4755]: I0320 13:58:01.088256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-krhnp" event={"ID":"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6","Type":"ContainerStarted","Data":"600d812ad812154760a23e2a9ec4cfe3f80ac8d745075a008606eb069cbe49f7"} Mar 20 13:58:03 crc kubenswrapper[4755]: I0320 13:58:03.118297 4755 generic.go:334] "Generic (PLEG): container finished" podID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerID="6b93c4f221413ce21758a54cb81f9dd1307ec1714e9b3709e5e630c41008370d" exitCode=0 Mar 20 13:58:03 crc kubenswrapper[4755]: I0320 13:58:03.118424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-krhnp" event={"ID":"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6","Type":"ContainerDied","Data":"6b93c4f221413ce21758a54cb81f9dd1307ec1714e9b3709e5e630c41008370d"} Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.555134 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.599901 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.609710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj" (OuterVolumeSpecName: "kube-api-access-bhqhj") pod "1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" (UID: "1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6"). InnerVolumeSpecName "kube-api-access-bhqhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.701707 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.136975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-krhnp" event={"ID":"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6","Type":"ContainerDied","Data":"600d812ad812154760a23e2a9ec4cfe3f80ac8d745075a008606eb069cbe49f7"} Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.137402 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="600d812ad812154760a23e2a9ec4cfe3f80ac8d745075a008606eb069cbe49f7" Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.137464 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.632688 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"] Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.645731 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"] Mar 20 13:58:07 crc kubenswrapper[4755]: I0320 13:58:07.235700 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" path="/var/lib/kubelet/pods/78cf774b-eb80-4f5b-a7de-2012636d36c5/volumes" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.190582 4755 scope.go:117] "RemoveContainer" containerID="72b2f009d2a4423710b2308fccd453e64decc2036c9ffeba13690d2169eaf608" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.250917 4755 scope.go:117] "RemoveContainer" containerID="b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.299431 4755 scope.go:117] "RemoveContainer" containerID="e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.329416 4755 scope.go:117] "RemoveContainer" containerID="56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0" Mar 20 13:58:41 crc kubenswrapper[4755]: I0320 13:58:41.630839 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-srzwn" podUID="1107b669-3bdf-4189-a37a-b79ddb758fff" containerName="registry-server" probeResult="failure" output=< Mar 20 13:58:41 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:58:41 crc kubenswrapper[4755]: > Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.067306 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:59:00 crc 
kubenswrapper[4755]: I0320 13:59:00.094161 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.105198 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.116897 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.126371 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.133196 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.147775 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.159092 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.268073 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" path="/var/lib/kubelet/pods/0587eb58-cd5e-4e0b-be30-97e0a569fc57/volumes" Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.270179 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" path="/var/lib/kubelet/pods/0795b626-b382-4b9b-beb5-802cebc4f764/volumes" Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.271965 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" path="/var/lib/kubelet/pods/2af42784-d5cc-4f7c-832a-f91dbd54cc3f/volumes" Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.273889 4755 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" path="/var/lib/kubelet/pods/79c00857-0d6a-4c12-8581-da16e2a24f04/volumes" Mar 20 13:59:06 crc kubenswrapper[4755]: I0320 13:59:06.751162 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:59:06 crc kubenswrapper[4755]: I0320 13:59:06.752296 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:07 crc kubenswrapper[4755]: I0320 13:59:07.034760 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:59:07 crc kubenswrapper[4755]: I0320 13:59:07.043358 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:59:07 crc kubenswrapper[4755]: I0320 13:59:07.237023 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" path="/var/lib/kubelet/pods/46d041c2-e231-49fd-9d88-a991a1b9dd65/volumes" Mar 20 13:59:08 crc kubenswrapper[4755]: I0320 13:59:08.033163 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:59:08 crc kubenswrapper[4755]: I0320 13:59:08.041344 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:59:09 crc kubenswrapper[4755]: I0320 13:59:09.242257 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" path="/var/lib/kubelet/pods/6fe77db3-29ef-42ae-840b-9736f07188ca/volumes" Mar 20 13:59:15 crc kubenswrapper[4755]: I0320 13:59:15.919055 4755 generic.go:334] "Generic (PLEG): container finished" podID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" exitCode=0 Mar 20 13:59:15 crc kubenswrapper[4755]: I0320 13:59:15.919110 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerDied","Data":"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde"} Mar 20 13:59:15 crc kubenswrapper[4755]: I0320 13:59:15.920322 4755 scope.go:117] "RemoveContainer" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:16 crc kubenswrapper[4755]: I0320 13:59:16.959443 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6z9hz_must-gather-2mgxd_9e7e4d4d-749a-4ec8-89f4-1362f7787e43/gather/0.log" Mar 20 13:59:24 crc kubenswrapper[4755]: I0320 13:59:24.980097 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:59:24 crc kubenswrapper[4755]: I0320 13:59:24.980975 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="copy" containerID="cri-o://5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" gracePeriod=2 Mar 20 13:59:24 crc kubenswrapper[4755]: I0320 13:59:24.992694 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.487560 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-6z9hz_must-gather-2mgxd_9e7e4d4d-749a-4ec8-89f4-1362f7787e43/copy/0.log" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.488555 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.654497 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.654705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.666488 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh" (OuterVolumeSpecName: "kube-api-access-8wmwh") pod "9e7e4d4d-749a-4ec8-89f4-1362f7787e43" (UID: "9e7e4d4d-749a-4ec8-89f4-1362f7787e43"). InnerVolumeSpecName "kube-api-access-8wmwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.756861 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.798921 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9e7e4d4d-749a-4ec8-89f4-1362f7787e43" (UID: "9e7e4d4d-749a-4ec8-89f4-1362f7787e43"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.862454 4755 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.023470 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6z9hz_must-gather-2mgxd_9e7e4d4d-749a-4ec8-89f4-1362f7787e43/copy/0.log" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.023890 4755 generic.go:334] "Generic (PLEG): container finished" podID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" exitCode=143 Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.023942 4755 scope.go:117] "RemoveContainer" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.024096 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.043473 4755 scope.go:117] "RemoveContainer" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.092887 4755 scope.go:117] "RemoveContainer" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" Mar 20 13:59:26 crc kubenswrapper[4755]: E0320 13:59:26.093355 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4\": container with ID starting with 5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4 not found: ID does not exist" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.093399 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4"} err="failed to get container status \"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4\": rpc error: code = NotFound desc = could not find container \"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4\": container with ID starting with 5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4 not found: ID does not exist" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.093424 4755 scope.go:117] "RemoveContainer" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:26 crc kubenswrapper[4755]: E0320 13:59:26.093871 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde\": container with ID starting with 
e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde not found: ID does not exist" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.093900 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde"} err="failed to get container status \"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde\": rpc error: code = NotFound desc = could not find container \"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde\": container with ID starting with e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde not found: ID does not exist" Mar 20 13:59:27 crc kubenswrapper[4755]: I0320 13:59:27.260441 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" path="/var/lib/kubelet/pods/9e7e4d4d-749a-4ec8-89f4-1362f7787e43/volumes" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.053716 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.064875 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.459695 4755 scope.go:117] "RemoveContainer" containerID="674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.502674 4755 scope.go:117] "RemoveContainer" containerID="618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.558478 4755 scope.go:117] "RemoveContainer" containerID="80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.600213 4755 scope.go:117] "RemoveContainer" 
containerID="0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.646982 4755 scope.go:117] "RemoveContainer" containerID="ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.716262 4755 scope.go:117] "RemoveContainer" containerID="9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955" Mar 20 13:59:29 crc kubenswrapper[4755]: I0320 13:59:29.245105 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" path="/var/lib/kubelet/pods/8ae45e95-b96a-4157-a584-a6eb321d5091/volumes" Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.031800 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2jwbt"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.055574 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jm9nr"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.063062 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.102945 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2jwbt"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.116999 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hm9qz"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.123886 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jm9nr"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.130100 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.138376 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hm9qz"] Mar 20 
13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.144699 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.150853 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.751616 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.751710 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.241727 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" path="/var/lib/kubelet/pods/015c8ae7-1856-4b0c-b5ce-e2503a2080dc/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.242551 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" path="/var/lib/kubelet/pods/5dde547e-5fce-4868-ba0e-63650ea0c771/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.248477 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" path="/var/lib/kubelet/pods/8c5d05dc-a589-4d2e-9374-0d57202a3cfc/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.249131 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e38d31ac-eae6-4cd1-be04-304215db852a" path="/var/lib/kubelet/pods/e38d31ac-eae6-4cd1-be04-304215db852a/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.249777 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" path="/var/lib/kubelet/pods/feb55e83-711d-4561-8b57-2a231944e1b1/volumes" Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.047285 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.064362 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.081833 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.096230 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.239647 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" path="/var/lib/kubelet/pods/3047e6fe-5128-4361-bede-e9f0c4e9387c/volumes" Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.240399 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" path="/var/lib/kubelet/pods/34c85756-25cf-4302-bd5d-72f2e459f562/volumes" Mar 20 13:59:45 crc kubenswrapper[4755]: I0320 13:59:45.038850 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9xrbx"] Mar 20 13:59:45 crc kubenswrapper[4755]: I0320 13:59:45.047398 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9xrbx"] Mar 20 13:59:45 crc kubenswrapper[4755]: I0320 13:59:45.248401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" path="/var/lib/kubelet/pods/64ad8e64-0606-4171-bd2d-ae8212fdff8f/volumes" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.166420 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw"] Mar 20 14:00:00 crc kubenswrapper[4755]: E0320 14:00:00.167417 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="gather" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167462 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="gather" Mar 20 14:00:00 crc kubenswrapper[4755]: E0320 14:00:00.167499 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerName="oc" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167508 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerName="oc" Mar 20 14:00:00 crc kubenswrapper[4755]: E0320 14:00:00.167539 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="copy" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167548 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="copy" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167877 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="gather" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167905 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerName="oc" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167924 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" 
containerName="copy" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.168941 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.171330 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.172246 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.191207 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw"] Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.268768 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566920-b2lfj"] Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.271395 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.273774 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.274843 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.274882 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.279578 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-b2lfj"] Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.319837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.320761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.320928 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: 
\"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423245 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423409 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"auto-csr-approver-29566920-b2lfj\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.424934 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.439124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.453902 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.511379 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.525724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"auto-csr-approver-29566920-b2lfj\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.561170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"auto-csr-approver-29566920-b2lfj\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.602166 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.993040 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw"] Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.135476 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-b2lfj"] Mar 20 14:00:01 crc kubenswrapper[4755]: W0320 14:00:01.145631 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07cef7a_c5f6_4f4b_8508_8d499928b255.slice/crio-832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14 WatchSource:0}: Error finding container 832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14: Status 404 returned error can't find the container with id 832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14 Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.419438 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" event={"ID":"c07cef7a-c5f6-4f4b-8508-8d499928b255","Type":"ContainerStarted","Data":"832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14"} Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.422486 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerID="c6bc10e055ad85ee77ba59e601c3da9aa174ba7f0b28a4725b0f849a16bdfa98" exitCode=0 Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.422520 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" event={"ID":"e6e921d7-2303-42ff-ac0e-89b8b15127e4","Type":"ContainerDied","Data":"c6bc10e055ad85ee77ba59e601c3da9aa174ba7f0b28a4725b0f849a16bdfa98"} Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.422540 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" event={"ID":"e6e921d7-2303-42ff-ac0e-89b8b15127e4","Type":"ContainerStarted","Data":"6a99d7803f3d183b8dee018512af84388d5a11492309961badfbea7208dd5ba4"} Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.922753 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.986408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.986472 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.986531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.987504 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6e921d7-2303-42ff-ac0e-89b8b15127e4" (UID: "e6e921d7-2303-42ff-ac0e-89b8b15127e4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.993595 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e6e921d7-2303-42ff-ac0e-89b8b15127e4" (UID: "e6e921d7-2303-42ff-ac0e-89b8b15127e4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.995864 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq" (OuterVolumeSpecName: "kube-api-access-vqdzq") pod "e6e921d7-2303-42ff-ac0e-89b8b15127e4" (UID: "e6e921d7-2303-42ff-ac0e-89b8b15127e4"). InnerVolumeSpecName "kube-api-access-vqdzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.088859 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.088898 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.088910 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.450546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" 
event={"ID":"e6e921d7-2303-42ff-ac0e-89b8b15127e4","Type":"ContainerDied","Data":"6a99d7803f3d183b8dee018512af84388d5a11492309961badfbea7208dd5ba4"} Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.451016 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a99d7803f3d183b8dee018512af84388d5a11492309961badfbea7208dd5ba4" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.450634 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.751051 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.751563 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.751635 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.752986 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:00:06 crc 
kubenswrapper[4755]: I0320 14:00:06.753094 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" gracePeriod=600 Mar 20 14:00:06 crc kubenswrapper[4755]: E0320 14:00:06.896533 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.495573 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" exitCode=0 Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.495672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154"} Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.495760 4755 scope.go:117] "RemoveContainer" containerID="8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a" Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.497178 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:07 crc kubenswrapper[4755]: E0320 14:00:07.498115 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.035813 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.046074 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.244827 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" path="/var/lib/kubelet/pods/69707be4-e338-4e13-8ecc-8cfd7cd416b2/volumes" Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.606428 4755 generic.go:334] "Generic (PLEG): container finished" podID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerID="be6f17dea20c6888dee20766234195b91965ff23909e1764b2d47f7abaf02c60" exitCode=0 Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.606493 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" event={"ID":"c07cef7a-c5f6-4f4b-8508-8d499928b255","Type":"ContainerDied","Data":"be6f17dea20c6888dee20766234195b91965ff23909e1764b2d47f7abaf02c60"} Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.005683 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.019557 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"c07cef7a-c5f6-4f4b-8508-8d499928b255\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.025148 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf" (OuterVolumeSpecName: "kube-api-access-ns8vf") pod "c07cef7a-c5f6-4f4b-8508-8d499928b255" (UID: "c07cef7a-c5f6-4f4b-8508-8d499928b255"). InnerVolumeSpecName "kube-api-access-ns8vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.122277 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.643324 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" event={"ID":"c07cef7a-c5f6-4f4b-8508-8d499928b255","Type":"ContainerDied","Data":"832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14"} Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.643610 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.643388 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:18 crc kubenswrapper[4755]: I0320 14:00:18.073543 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 14:00:18 crc kubenswrapper[4755]: I0320 14:00:18.084942 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 14:00:19 crc kubenswrapper[4755]: I0320 14:00:19.237108 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" path="/var/lib/kubelet/pods/ea7c11fe-b29d-4fa4-a46d-7079105e883e/volumes" Mar 20 14:00:21 crc kubenswrapper[4755]: I0320 14:00:21.235316 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:21 crc kubenswrapper[4755]: E0320 14:00:21.236825 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:27 crc kubenswrapper[4755]: I0320 14:00:27.042043 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 14:00:27 crc kubenswrapper[4755]: I0320 14:00:27.051951 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 14:00:27 crc kubenswrapper[4755]: I0320 14:00:27.243153 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" path="/var/lib/kubelet/pods/5dddb768-c318-44b8-bac9-ea26f29ca038/volumes" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.859190 4755 
scope.go:117] "RemoveContainer" containerID="46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.902300 4755 scope.go:117] "RemoveContainer" containerID="705f6219cf6e7229f8b2ed7393ea0a90aeac31b526f89efc1dd2e1e93d320b12" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.954074 4755 scope.go:117] "RemoveContainer" containerID="357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.996288 4755 scope.go:117] "RemoveContainer" containerID="291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.033475 4755 scope.go:117] "RemoveContainer" containerID="035f6fa288ba835c95b145a119d04cf41e9e3a54cd012475c7a081a2276a5557" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.066350 4755 scope.go:117] "RemoveContainer" containerID="02c033d98a31eff9b6f2fd27a65dcce2cdba9ee50e31a547659840069ed55645" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.102380 4755 scope.go:117] "RemoveContainer" containerID="cea560be39cccd516b77d0d30da3bc9d64db06b7455423a8e22eacb2c87d57e2" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.125565 4755 scope.go:117] "RemoveContainer" containerID="df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.167131 4755 scope.go:117] "RemoveContainer" containerID="12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.198273 4755 scope.go:117] "RemoveContainer" containerID="60f5595fcede6ec841b414dc41e27b9bf107d18aaf78a0ca6302cf7b01dc28b2" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.218758 4755 scope.go:117] "RemoveContainer" containerID="b24cc29f4a3d45fd8adb655ff3aefc2dd43173d332839123e38cb6e66cc20cc0" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.257116 4755 scope.go:117] 
"RemoveContainer" containerID="e563b8f3d31e55e3468e71d1526b9d84a5066f3dfe1e07450115316e1267a59c" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.296222 4755 scope.go:117] "RemoveContainer" containerID="c657148ccc1a27d9b62255884d6a6e1d1019e179c3fce4621605696f07b5b3a8" Mar 20 14:00:32 crc kubenswrapper[4755]: I0320 14:00:32.029953 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 14:00:32 crc kubenswrapper[4755]: I0320 14:00:32.037556 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 14:00:33 crc kubenswrapper[4755]: I0320 14:00:33.238891 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" path="/var/lib/kubelet/pods/7ea35a84-68ca-4490-b1d9-fa999ef63ebe/volumes" Mar 20 14:00:35 crc kubenswrapper[4755]: I0320 14:00:35.226331 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:35 crc kubenswrapper[4755]: E0320 14:00:35.227236 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:38 crc kubenswrapper[4755]: I0320 14:00:38.025834 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dtggj"] Mar 20 14:00:38 crc kubenswrapper[4755]: I0320 14:00:38.032939 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dtggj"] Mar 20 14:00:39 crc kubenswrapper[4755]: I0320 14:00:39.238780 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="95c76f8c-7b76-4714-adac-6297b84d6492" path="/var/lib/kubelet/pods/95c76f8c-7b76-4714-adac-6297b84d6492/volumes" Mar 20 14:00:40 crc kubenswrapper[4755]: I0320 14:00:40.036434 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jrf8c"] Mar 20 14:00:40 crc kubenswrapper[4755]: I0320 14:00:40.051356 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jrf8c"] Mar 20 14:00:41 crc kubenswrapper[4755]: I0320 14:00:41.246410 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" path="/var/lib/kubelet/pods/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0/volumes" Mar 20 14:00:48 crc kubenswrapper[4755]: I0320 14:00:48.226163 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:48 crc kubenswrapper[4755]: E0320 14:00:48.227075 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.143835 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566921-crbwk"] Mar 20 14:01:00 crc kubenswrapper[4755]: E0320 14:01:00.144831 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.144851 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4755]: E0320 14:01:00.144871 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.144879 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.145104 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.145139 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.170629 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-crbwk"] Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.170816 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.225976 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:00 crc kubenswrapper[4755]: E0320 14:01:00.226463 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287709 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod 
\"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287756 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287937 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.390229 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.390406 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod 
\"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.390490 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.391611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.399693 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.401313 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.402298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " 
pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.418037 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.517223 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.986607 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-crbwk"] Mar 20 14:01:01 crc kubenswrapper[4755]: I0320 14:01:01.516576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerStarted","Data":"e69184e885edefcaa6ad275a8ecefe11cce8281520d327bb131182884f5c2065"} Mar 20 14:01:01 crc kubenswrapper[4755]: I0320 14:01:01.516989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerStarted","Data":"2a5931d4d7cb8a12f57c6949c17d4b5fc5b8b0bc4e44b9e219403fc58255c144"} Mar 20 14:01:01 crc kubenswrapper[4755]: I0320 14:01:01.534995 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566921-crbwk" podStartSLOduration=1.5349696800000001 podStartE2EDuration="1.53496968s" podCreationTimestamp="2026-03-20 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:01:01.533785437 +0000 UTC m=+1841.131717976" watchObservedRunningTime="2026-03-20 14:01:01.53496968 +0000 UTC m=+1841.132902249" Mar 20 
14:01:03 crc kubenswrapper[4755]: I0320 14:01:03.538824 4755 generic.go:334] "Generic (PLEG): container finished" podID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerID="e69184e885edefcaa6ad275a8ecefe11cce8281520d327bb131182884f5c2065" exitCode=0 Mar 20 14:01:03 crc kubenswrapper[4755]: I0320 14:01:03.538908 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerDied","Data":"e69184e885edefcaa6ad275a8ecefe11cce8281520d327bb131182884f5c2065"} Mar 20 14:01:04 crc kubenswrapper[4755]: I0320 14:01:04.928205 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082398 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082546 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082768 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082818 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.090917 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.093074 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr" (OuterVolumeSpecName: "kube-api-access-bz6fr") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "kube-api-access-bz6fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.143995 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.147180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data" (OuterVolumeSpecName: "config-data") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188734 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188783 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188896 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188910 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.575430 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerDied","Data":"2a5931d4d7cb8a12f57c6949c17d4b5fc5b8b0bc4e44b9e219403fc58255c144"} Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.575490 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5931d4d7cb8a12f57c6949c17d4b5fc5b8b0bc4e44b9e219403fc58255c144" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.575571 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:13 crc kubenswrapper[4755]: I0320 14:01:13.226366 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:13 crc kubenswrapper[4755]: E0320 14:01:13.227558 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:24 crc kubenswrapper[4755]: I0320 14:01:24.226252 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:24 crc kubenswrapper[4755]: E0320 14:01:24.227306 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.043992 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.054565 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.066262 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.078159 4755 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.088302 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.098341 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.242803 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" path="/var/lib/kubelet/pods/0deb3f1a-0cad-4429-9e79-38e5a0b38896/volumes" Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.244004 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" path="/var/lib/kubelet/pods/32a5606c-c777-4c0b-951c-6ce2e03edd7e/volumes" Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.245241 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f395acec-f28b-4622-b349-127cf31ec92d" path="/var/lib/kubelet/pods/f395acec-f28b-4622-b349-127cf31ec92d/volumes" Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.052822 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.065496 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.074117 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.082546 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.090605 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.097268 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.243350 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673" path="/var/lib/kubelet/pods/03accbff-bdf2-4256-bdf2-1b39d5485673/volumes" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.244106 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" path="/var/lib/kubelet/pods/39991203-9b8d-4985-8e90-b3d1772f6b8f/volumes" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.244822 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" path="/var/lib/kubelet/pods/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86/volumes" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.533819 4755 scope.go:117] "RemoveContainer" containerID="d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.563118 4755 scope.go:117] "RemoveContainer" containerID="08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.620176 4755 scope.go:117] "RemoveContainer" containerID="a9a2e83547c76638fc8671a99e0bfb3517ad85689f2490760b78b38ac376cdd5" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.663089 4755 scope.go:117] "RemoveContainer" containerID="34bb019b6b2edd84278525de71c1498dee8194d1e832aa7f19aa00c20a976f27" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.715316 4755 scope.go:117] "RemoveContainer" containerID="f48f17b3619a61fc0cb88d69afecc573c1b266d447ea55b0cd7bd4a5a7acc1ba" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.752533 4755 scope.go:117] "RemoveContainer" 
containerID="8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.834284 4755 scope.go:117] "RemoveContainer" containerID="88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.884191 4755 scope.go:117] "RemoveContainer" containerID="d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.929114 4755 scope.go:117] "RemoveContainer" containerID="d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7" Mar 20 14:01:39 crc kubenswrapper[4755]: I0320 14:01:39.226343 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:39 crc kubenswrapper[4755]: E0320 14:01:39.227525 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:50 crc kubenswrapper[4755]: I0320 14:01:50.225806 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:50 crc kubenswrapper[4755]: E0320 14:01:50.226655 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:56 crc 
kubenswrapper[4755]: I0320 14:01:56.055372 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 14:01:56 crc kubenswrapper[4755]: I0320 14:01:56.066768 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 14:01:57 crc kubenswrapper[4755]: I0320 14:01:57.244299 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" path="/var/lib/kubelet/pods/faef786e-b221-4fff-8d48-42b8163ed86a/volumes" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.149286 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566922-9wz8j"] Mar 20 14:02:00 crc kubenswrapper[4755]: E0320 14:02:00.150122 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerName="keystone-cron" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.150141 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerName="keystone-cron" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.150379 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerName="keystone-cron" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.151457 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.155028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.155209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.155790 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.167188 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-9wz8j"] Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.301749 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"auto-csr-approver-29566922-9wz8j\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.403724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"auto-csr-approver-29566922-9wz8j\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.430822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"auto-csr-approver-29566922-9wz8j\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " 
pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.525121 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:01 crc kubenswrapper[4755]: I0320 14:02:01.046528 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-9wz8j"] Mar 20 14:02:01 crc kubenswrapper[4755]: I0320 14:02:01.052169 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:02:01 crc kubenswrapper[4755]: I0320 14:02:01.244326 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" event={"ID":"a0bc6168-1eb1-41f8-8921-70564488dc62","Type":"ContainerStarted","Data":"61671a28fbd88a8f2f787f0c45e8cba7c10eba3cd554b595c2415e7908a119c4"} Mar 20 14:02:03 crc kubenswrapper[4755]: I0320 14:02:03.271149 4755 generic.go:334] "Generic (PLEG): container finished" podID="a0bc6168-1eb1-41f8-8921-70564488dc62" containerID="68a0b051f32e2cc882ef284df84e75dea8c9605ead333a40bfbbc39dbec1e0c1" exitCode=0 Mar 20 14:02:03 crc kubenswrapper[4755]: I0320 14:02:03.271242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" event={"ID":"a0bc6168-1eb1-41f8-8921-70564488dc62","Type":"ContainerDied","Data":"68a0b051f32e2cc882ef284df84e75dea8c9605ead333a40bfbbc39dbec1e0c1"} Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.226282 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:04 crc kubenswrapper[4755]: E0320 14:02:04.226639 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.663613 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.803547 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"a0bc6168-1eb1-41f8-8921-70564488dc62\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.812797 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb" (OuterVolumeSpecName: "kube-api-access-s6qvb") pod "a0bc6168-1eb1-41f8-8921-70564488dc62" (UID: "a0bc6168-1eb1-41f8-8921-70564488dc62"). InnerVolumeSpecName "kube-api-access-s6qvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.905074 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") on node \"crc\" DevicePath \"\"" Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.290463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" event={"ID":"a0bc6168-1eb1-41f8-8921-70564488dc62","Type":"ContainerDied","Data":"61671a28fbd88a8f2f787f0c45e8cba7c10eba3cd554b595c2415e7908a119c4"} Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.290507 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61671a28fbd88a8f2f787f0c45e8cba7c10eba3cd554b595c2415e7908a119c4" Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.290520 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.781849 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.793871 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 14:02:07 crc kubenswrapper[4755]: I0320 14:02:07.247166 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" path="/var/lib/kubelet/pods/c74d4c86-05c3-4ac3-a18e-cb75b4d95559/volumes" Mar 20 14:02:15 crc kubenswrapper[4755]: I0320 14:02:15.041729 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 14:02:15 crc kubenswrapper[4755]: I0320 14:02:15.050488 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 14:02:15 crc kubenswrapper[4755]: I0320 14:02:15.245337 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" path="/var/lib/kubelet/pods/2ff73477-b65b-4362-938c-94b1bb1f51b0/volumes" Mar 20 14:02:16 crc kubenswrapper[4755]: I0320 14:02:16.031964 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 14:02:16 crc kubenswrapper[4755]: I0320 14:02:16.039032 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 14:02:16 crc kubenswrapper[4755]: I0320 14:02:16.226371 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:16 crc kubenswrapper[4755]: E0320 14:02:16.226899 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:17 crc kubenswrapper[4755]: I0320 14:02:17.240337 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" path="/var/lib/kubelet/pods/cadbdc7c-ed66-43d7-82ee-d797beb959a8/volumes" Mar 20 14:02:27 crc kubenswrapper[4755]: I0320 14:02:27.227196 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:27 crc kubenswrapper[4755]: E0320 14:02:27.228382 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.066048 4755 scope.go:117] "RemoveContainer" containerID="4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.116886 4755 scope.go:117] "RemoveContainer" containerID="ce9d805a2c4c50680c23940622d796b78d00ed9243eb4db8b57356fad93506d8" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.168535 4755 scope.go:117] "RemoveContainer" containerID="109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.217460 4755 scope.go:117] "RemoveContainer" containerID="8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718" Mar 20 14:02:42 crc kubenswrapper[4755]: I0320 14:02:42.270930 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:42 crc kubenswrapper[4755]: E0320 14:02:42.271865 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:55 crc kubenswrapper[4755]: I0320 14:02:55.226183 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:55 crc kubenswrapper[4755]: E0320 14:02:55.229298 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:58 crc kubenswrapper[4755]: I0320 14:02:58.055782 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 14:02:58 crc kubenswrapper[4755]: I0320 14:02:58.063277 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 14:02:59 crc kubenswrapper[4755]: I0320 14:02:59.245647 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557c5385-782c-410a-a371-b27f41d88a47" path="/var/lib/kubelet/pods/557c5385-782c-410a-a371-b27f41d88a47/volumes" Mar 20 14:03:10 crc kubenswrapper[4755]: I0320 14:03:10.226467 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:03:10 crc kubenswrapper[4755]: E0320 14:03:10.227527 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c"